
Study supporting the

evaluation of the
Research Executive
Agency
(2015-2018)
Final Report

Written by PPMI, CSES and IDEA Consulting EN


February 2019

European Commission
Directorate-General for Research and Innovation
Directorate B — Common Implementation Centre
Unit B.5 — Executive Agencies & Funding Bodies
E-mail RTD-B5-EXECUTIVE-AGENCIES@ec.europa.eu
RTD-PUBLICATIONS@ec.europa.eu

European Commission
B-1049 Brussels

Manuscript completed in February 2019


This document has been prepared for the European Commission; however, it reflects only the views of the authors, and the
European Commission is not liable for any consequence stemming from the reuse of this publication.

More information on the European Union is available on the internet (http://europa.eu).

Luxembourg: Publications Office of the European Union, 2020

PDF ISBN: 978-92-76-02422-4 doi: 10.2777/299516 Catalogue number: KI-02-19-257-EN-N

© European Union, 2020

The reuse policy of European Commission documents is implemented based on Commission Decision 2011/833/EU of 12
December 2011 on the reuse of Commission documents (OJ L 330, 14.12.2011, p. 39). Unless otherwise noted, the reuse of
this document is authorised under a Creative Commons Attribution 4.0 International (CC-BY 4.0) licence
(https://creativecommons.org/licenses/by/4.0/). This means that reuse is allowed provided appropriate credit is given and any
changes are indicated.

For any use or reproduction of elements that are not owned by the European Union, permission may need to be sought directly
from the respective rightholders.

Image credits:
Cover page image: © Lonely # 46246900, ag visuell #16440826, Sean Gladwell #6018533, LwRedStorm #3348265, 2011;
kras99 #43746830, 2012. Source: Fotolia.com.
TABLE OF CONTENTS

TABLE OF CONTENTS ............................................................................................. 2


LIST OF FIGURES .................................................................................................. 4
LIST OF TABLES .................................................................................................... 6
LIST OF ABBREVIATIONS ....................................................................................... 8
1 INTRODUCTION .............................................................................................10
2 OBJECTIVES AND METHODOLOGY....................................................................10
2.1 Study objectives and scope ..........................................................................10
2.2 Methodology ..............................................................................................11
2.2.1 Overall framework for the evaluation of REA ...........................................11
2.2.2 Methodology .......................................................................................12
3 BACKGROUND: REA’S REGULATORY FRAMEWORK, MISSION AND GOVERNANCE
SYSTEM ..............................................................................................................14
3.1 The regulatory framework of setting up the Executive Agencies ........................14
3.2 REA history, mission and governance ............................................................15
3.3 Overview of REA’s performance during 2015-2018..........................................18
4 EVALUATION RESULTS: THE ASSESSMENT OF REA’S PERFORMANCE IN 2015-2018 ......22
4.1 Effectiveness, efficiency and coherence .........................................................22
4.1.1 Effectiveness .......................................................................................22
4.1.2 Efficiency ............................................................................................42
4.1.3 Coherence ..........................................................................................64
4.2 Results of the retrospective cost-benefit analysis ............................................82
4.2.1 Background of the quantitative CBA .......................................................82
4.2.2 Actual staffing and costs of REA .............................................................87
4.2.3 Cost-effectiveness of the Executive Agency scenario and actual savings due
to externalisation ..............................................................................................90
4.2.4 Workload analysis ................................................................................93
4.2.5 Qualitative aspects of the CBA ............................................................. 101
4.2.6 Recommendations on improving the quality of future cost–benefit analysis ......... 104
5 CONCLUSIONS AND RECOMMENDATIONS ....................................................... 105
5.1 Overall conclusions ................................................................................... 105
5.2 Specific conclusions and recommendations .................................................. 109
5.2.1 Effectiveness of REA ........................................................................... 109
5.2.2 Efficiency of REA ................................................................................ 112
5.2.3 Coherence of REA .............................................................................. 114
5.2.4 Retrospective CBA ............................................................................. 116
REFERENCES ..................................................................................................... 118
ANNEXES .......................................................................................................... 120
ANNEX 1: KEY ACTIVITIES UNDERTAKEN DURING THE EVALUATION ....................... 120
Documentary review and desk research ....................................................... 120
Interview programme .................................................................................. 122
In-depth study areas .................................................................................... 122
Surveys ........................................................................................................ 123
Cost–benefit analysis ................................................................................... 125
Focus group ................................................................................................. 126
Benchmarking .............................................................................................. 126
ANNEX 2: STAKEHOLDER CONSULTATION SYNTHESIS REPORT................................ 128
Consultation methods and target groups ......................................................... 128
Summary of the survey results ........................................................................ 129
Summary of the findings from the interview programme ................................ 135
Summary of the focus group results ................................................................ 140
ANNEX 3: INTERVIEW QUESTIONNAIRE ................................................................ 142
Interview questionnaires: interviews with EC and REA officials ...................... 142
Interview questionnaire: interviews with REA’s beneficiaries ......................... 153
ANNEX 4: FINAL SURVEY QUESTIONNAIRES .......................................................... 155
The questionnaire of the survey of REA beneficiaries: Horizon 2020 ............... 155

The questionnaire of the survey of unsuccessful REA applicants ..................... 166
The questionnaire of the survey of REA independent experts: monitoring
experts & evaluators ....................................................................................... 172
The questionnaire of the survey of EC officials ................................................ 183
ANNEX 5: IN-DEPTH STUDY AREAS ...................................................................... 196
Assessment of REA’s coherence, separation of tasks/roles between the
Commission and the Agency, as well as maintenance of the know-how within
the Commission ............................................................................................... 196
Assessment of management and provision of central support services ........... 206
Assessment of the newly introduced key business processes & efficiency gains
achieved .......................................................................................................... 214
Assessment of REA’s key success stories and lessons learned during 2015-2018
........................................................................................................................ 219
ANNEX 6: QUANTITATIVE BENCHMARKING OF THE COMMISSION’S EXECUTIVE
AGENCIES ......................................................................................................... 227
ANNEX 7: DATA RELATED TO RETROSPECTIVE CBA ................................................ 233

LIST OF FIGURES

FIGURE 1. INTERVENTION LOGIC ON THE OPERATIONS OF REA. ...............................11


FIGURE 2. ORGANISATIONAL MODEL FOR THE EVALUATION OF REA. .........................12
FIGURE 3. THE GOVERNANCE STRUCTURE OF REA AND ITS PARENT DGS. ..................17
FIGURE 4. REA’S PORTFOLIO OF RUNNING PROJECTS, FROM 2012 TO MID-2018.........20
FIGURE 5. EC OFFICIALS’ ASSESSMENT OF THE POLICY COHERENCE AND UNIFIED
COMMUNICATION BETWEEN THEM AND THE AGENCY. ..............................................28
FIGURE 6. THE EXTENT TO WHICH THE SURVEYED EC OFFICIALS AGREED WITH THE
FOLLOWING STATEMENTS REGARDING THE INPUTS PROVIDED TO THEM OR THEIR UNIT
BY REA. ...............................................................................................................29
FIGURE 7. THE EXTENT TO WHICH THE SURVEYED EC OFFICIALS WERE SUFFICIENTLY
INFORMED BY REA ABOUT THE PROGRESS OF THE EVALUATIONS UNDER THEIR CALL OR
RESEARCH TOPICS. ..............................................................................................30
FIGURE 8. EC OFFICIALS’ ASSESSMENT OF THE EXTENT TO WHICH REA IMPLEMENTED A
PROCESS WHICH ENSURED THAT THE PROPOSALS BEST ADDRESSING THE SPECIFIC
RESEARCH TOPICS UNDER THEIR AREAS OF RESPONSIBILITY WERE SELECTED FOR
FUNDING, AS SET OUT IN THE H2020 RULES FOR PARTICIPATION AND FURTHER
DETAILED IN THE VADEMECUM. .............................................................................32
FIGURE 9. THE LEVEL OF SATISFACTION WITH THE FREQUENCY OF INVITATIONS TO
ATTEND THE PROJECT REVIEW MEETINGS FOR PROJECTS FUNDED UNDER THEIR
PROGRAMME OR RESEARCH TOPICS DURING 2015-2018 AMONG THE SURVEYED EC
OFFICIALS. ..........................................................................................................33
FIGURE 10. OVERALL SATISFACTION OF BENEFICIARIES, UNSUCCESSFUL APPLICANTS,
AND EC OFFICIALS WITH THE PERFORMANCE OF THE AGENCY. .................................35
FIGURE 11. WILLINGNESS OF BENEFICIARIES, UNSUCCESSFUL APPLICANTS AND
EXTERNAL EXPERTS TO WORK WITH THE AGENCY AGAIN IN THE FUTURE. .................36
FIGURE 12. SATISFACTION OF BENEFICIARIES WITH REA’S EXTERNAL
COMMUNICATION/RESPONSIVENESS. .....................................................................38
FIGURE 13. THE EXTENT TO WHICH THE FOLLOWING COMMUNICATION CHANNELS
USED BY REA PROVIDED THE AGENCY’S BENEFICIARIES WITH RELEVANT, HELPFUL
INFORMATION WHEN THEY NEEDED IT. ..................................................................39
FIGURE 14. INDEPENDENT EXPERTS’ ASSESSMENT OF THE USEFULNESS OF
INFORMATION PROVIDED ONLINE. .........................................................................39
FIGURE 15. SATISFACTION OF INDEPENDENT EXPERTS WITH REA’S EXTERNAL
RESPONSIVENESS AND COMPETENCE. ....................................................................40
FIGURE 16. BENEFICIARIES’ VIEWS REGARDING VISIBILITY OF EU AS PROMOTER OF
THE PROGRAMMES ENTRUSTED TO REA. .................................................................41
FIGURE 17. REA’S PERFORMANCE IN TERMS OF THE AVERAGE TTG, 2010-2018. .........43
FIGURE 18. REA’S PERFORMANCE IN TERMS OF TTG (THE SHARE OF GRANTS
CONCLUDED WITHIN THE TTG TARGET) PER ACTIVITY AND YEAR OF THE CALL, 2014-
2017. ..................................................................................................................44
FIGURE 19. SATISFACTION OF BENEFICIARIES WITH THE PERFORMANCE OF REA IN
RELATION TO TIMELINESS OF THE EVALUATION, SELECTION AND CONTRACTING
PROCESS.............................................................................................................44
FIGURE 20. SATISFACTION OF BENEFICIARIES WITH THE APPLICATION PROCESS. .....45
FIGURE 21. SATISFACTION OF BENEFICIARIES WITH THE EVALUATION PROCESS. ......46
FIGURE 22. SATISFACTION OF EC OFFICIALS WITH THE EVALUATION PROCESS
ORGANISED BY REA. ............................................................................................47
FIGURE 23. SHARE OF EVALUATION REVIEW/REDRESS CASES (%)............................47
FIGURE 24. SATISFACTION OF BENEFICIARIES WITH THE CONTRACTING PROCESS. ...48
FIGURE 25. REA’S PERFORMANCE IN TERMS OF THE AVERAGE TTP, 2013-2018...........49
FIGURE 26. SATISFACTION OF BENEFICIARIES WITH THE PERFORMANCE OF REA IN
RELATION TO TIMELINESS OF THE PAYMENT PROCESS. ............................................50
FIGURE 27. REA’S PERFORMANCE IN TERMS OF THE AVERAGE TTA, 2014-2018. .........51
FIGURE 28. SATISFACTION OF BENEFICIARIES WITH THE PERFORMANCE OF REA IN
RELATION TO THE GRANT AMENDMENT PROCESS. ...................................................51
FIGURE 29. SATISFACTION OF BENEFICIARIES WITH THE REPORTING AND MONITORING
PROCESSES. ........................................................................................................52
FIGURE 30. SHARE OF CLOSED PROJECTS WHICH REACHED ALL OR MOST OF THEIR
OBJECTIVES IN 2012-2018. ...................................................................................53
FIGURE 31. FP7 RESIDUAL ERROR RATE IN 2014-2018. ............................................54
FIGURE 32. RATIO BETWEEN REA’S ADMINISTRATIVE AND OPERATIONAL BUDGET. ....55
FIGURE 33. COSTS AND SAVINGS OF THE EXECUTIVE AGENCY SCENARIO IN 2015-2018,
EUR. ...................................................................................................................92
FIGURE 34. SFS ESTIMATED AND REA’S ACTUAL CUMULATIVE OPERATIONAL BUDGET
2014-2018, EUR MILLION. .....................................................................................95
FIGURE 35. FRAMEWORK OF THE AGENCY'S PERFORMANCE. ................................... 106
FIGURE 36. OVERALL SATISFACTION OF BENEFICIARIES, APPLICANTS, AND EC
OFFICIALS WITH THE PERFORMANCE OF THE AGENCY. ........................................... 112
FIGURE 37. ESTIMATED COSTS AND SAVINGS OF THE EXECUTIVE AGENCY SCENARIO IN
2015-2018, EUR. ................................................................................................ 117
FIGURE 38. STRUCTURED MECHANISMS FOR POLICY FEEDBACK. ............................ 200
FIGURE 39. THE EXTENT TO WHICH EXPERT EVALUATORS AGREED WITH THE RELATED
STATEMENTS ON THE REGISTRATION PROCESS. ................................................... 206
FIGURE 40. THE EXTENT TO WHICH EXPERT EVALUATORS AGREED WITH THE RELATED
STATEMENTS ON THE SELECTION AND CONTRACTING PROCESS. ............................ 207
FIGURE 41. SHARE OF EXPERT EVALUATORS WHO STRONGLY OR RATHER AGREED WITH
THE RELATED STATEMENTS ON THE PERFORMANCE OF THEIR ASSIGNMENTS AND
TASKS. ............................................................................................................. 208
FIGURE 42. THE EXTENT TO WHICH EXPERT EVALUATORS AGREED WITH THE RELATED
STATEMENTS ON THE PAYMENTS MADE BY REA. .................................................... 209
FIGURE 43. BENEFICIARIES’ SATISFACTION WITH THE SERVICES OF REA’S RESEARCH
ENQUIRY SERVICE. ............................................................................................. 210
FIGURE 44. SATISFACTION OF BENEFICIARIES WITH THE PERFORMANCE OF REA IN
RELATION TO TIMELINESS OF THE EVALUATION, CONTRACTING AND GRANT
MANAGEMENT PROCESSES. ................................................................................. 215
FIGURE 45. SATISFACTION OF BENEFICIARIES WITH THE VARIOUS ASPECTS OF
APPLICATION, CONTRACTING AND GRANT MANAGEMENT PROCESSES. .................... 216
FIGURE 46. SATISFACTION OF BENEFICIARIES WITH THE IT TOOLS USED FOR
APPLICATION, CONTRACTING AND GRANT MANAGEMENT PROCESSES. .................... 217
FIGURE 47. ORGANISATIONAL CHART OF REA (AS OF APRIL 2018). ......................... 220

LIST OF TABLES

TABLE 1. TARGETING AND COVERAGE OF THE SURVEY PROGRAMME, INCLUDING KEY
CHARACTERISTICS OF EACH SURVEY SAMPLE. ........................................................13
TABLE 2. REA’S EVOLUTION OF ACTIVITIES FROM 2012 TO MID-2018........................19
TABLE 3. OPERATIONAL AND ADMINISTRATIVE BUDGETS OF REA WITH THE COST-
EFFECTIVENESS OF CONTROLS FROM 2012 TO 2017. ...............................................19
TABLE 4. BREAKDOWN OF REA’S RUNNING PROJECTS, FROM 2015 TO 2018. ..............20
TABLE 5. OVERVIEW OF KEY CENTRAL SUPPORT SERVICES PROVIDED BY REA BETWEEN
2014 AND MID-2018. ............................................................................................21
TABLE 6. REA’S PERFORMANCE IN TERMS OF TTP, 2012-2018. ..................................49
TABLE 7. OVERVIEW OF THE FREQUENCY, QUALITY AND UPTAKE OF VARIOUS POLICY
OUTPUTS PROVIDED BY REA TO THE PARENT DGS. ..................................................69
TABLE 8. OVERVIEW OF THE DIFFERENT DELEGATION SCENARIOS. ...........................83
TABLE 9. BUDGET MANAGED AND HUMAN RESOURCES IN REA COMPARED TO ALL
EXECUTIVE AGENCIES IN 2013 AND 2020. ..............................................................85
TABLE 10. ASSUMPTIONS USED IN THE EX ANTE CBA AND SFS. ................................85
TABLE 11. ACTUAL AND ESTIMATED REA’S ADMINISTRATIVE BUDGET, EUR MILLION. ..88
TABLE 12. ACTUAL AND ESTIMATED NUMBER OF REA STAFF. ....................................90
TABLE 13. ESTIMATED “ACTUAL” COMMISSION STAFF COSTS AND OVERHEADS, EUR. .91
TABLE 14. SFS ESTIMATED AND REA’S ACTUAL OPERATIONAL BUDGET 2014-2018,
EUR MILLION .......................................................................................................93
TABLE 15. SFS ESTIMATED AND REA’S ACTUAL OPERATIONAL BUDGET 2014-2018 BY
PROGRAMME, EUR MILLION. ..................................................................................95
TABLE 16. AVERAGE GRANT SIZE IN 2014-2018 BY PROGRAMME, EUR MILLION. .........96
TABLE 17. CBA ESTIMATED AND ACTUAL NUMBER OF NEW GRANTS ACROSS DIFFERENT
PROGRAMMES AND ACTIONS MANAGED BY REA UNDER 2014-2018 CALLS. ................97
TABLE 18. CBA ESTIMATED AND ACTUAL NUMBER OF NEW GRANTS ACROSS DIFFERENT
PROGRAMMES AND ACTIONS MANAGED BY REA UNDER 2014-2018 CALLS. ................98
TABLE 19. CBA ESTIMATED AND ACTUAL WORKLOAD PARAMETERS RELATED TO THE
CENTRAL SUPPORT SERVICES PROVIDED BY REA. .................................................. 100
TABLE 20. QUALITATIVE AND QUANTITATIVE ASPECTS OF THE CBA AND THEIR
CORRESPONDENCE TO THE EVALUATION SECTIONS. ............................................. 102
TABLE 21. PERFORMANCE INDICATORS OF REA, 2014-2017, EUR MILLION OR
PERCENTAGE. .................................................................................................... 107
TABLE 22. SUMMARY OF SURVEY RESPONSE AND COMPLETION RATES. ................... 124
TABLE 23. THE INDICATORS FOR COMPARATIVE ANALYSIS AND BENCHMARKING. .... 126
TABLE 24. TYPE OF STAKEHOLDER CONSULTATION AND STAKEHOLDERS ENGAGED. . 128
TABLE 25. SUMMARY OF THE SURVEY RESULTS. .................................................... 129
TABLE 26. SUMMARY OF INTERVIEW RESULTS. ...................................................... 135
TABLE 27. SUMMARY OF MEETING/FOCUS GROUP RESULTS. ................................... 140
TABLE 28. EXECUTION OF PAYMENTS TO EXPERTS DURING 2014-2017. ................... 208
TABLE 29. HANDLING OF REQUESTS TO THE ENQUIRY SERVICE .............................. 210
TABLE 30. TARGETS RELATED TO TIME TO VALIDATE URF VALIDATION REQUESTS,
FROM 2014 TO MID-2018. ................................................................................... 211
TABLE 31. TARGETS RELATED TO SUPPORT FOR FINANCIAL CAPACITY ASSESSMENTS,
FROM 2014 TO MID-2018. ................................................................................... 212
TABLE 32. THE OVERVIEW OF REA’S KEY SUCCESS STORIES AND LESSONS LEARNED IN
2015-2018. ....................................................................................................... 219
TABLE 33. MAIN PERFORMANCE INDICATORS OF THE COMMISSION’S EXECUTIVE
AGENCIES, 2016. ............................................................................................... 227
TABLE 34. CBA AND HR INDICATORS OF THE COMMISSION’S EXECUTIVE AGENCIES,
2015-2017. ....................................................................................................... 229
TABLE 35. SATISFACTION WITH THE PROCESSES OF APPLICATION AND GRANT
MANAGEMENT IN THE EXECUTIVE AGENCIES (REA, EACEA, CHAFEA AND ERCEA) ...... 231
TABLE 36. SFS ESTIMATED COSTS OF THE IN-HOUSE AND THE EXECUTIVE AGENCY
SCENARIOS, EUR. .............................................................................................. 234
TABLE 37. ACTUAL COSTS OF THE IN-HOUSE AND THE EXECUTIVE AGENCY SCENARIOS,
EUR. ................................................................................................................. 237
TABLE 38. ACTUAL REA’S OPERATIONAL BUDGET 2014-2018, EUR MILLION. ............ 241
TABLE 39. ACTUAL EEA/EFTA AND THIRD COUNTRIES’ CONTRIBUTIONS, EUR MILLION.
........................................................................................................................ 241
TABLE 40. ACTUAL EEA/EFTA CONTRIBUTIONS, EUR MILLION. ................................ 242
TABLE 41. ACTUAL THIRD COUNTRIES’ CONTRIBUTIONS, EUR MILLION. .................. 243
TABLE 42. ACTUAL THIRD COUNTRIES’ CONTRIBUTIONS, EUR MILLION. .................. 243
TABLE 43. ACTUAL EU BUDGET CONTRIBUTIONS, EUR MILLION. ............................. 244

LIST OF ABBREVIATIONS

AAR – Annual Activity Report


AMIF – Asylum, Migration and Integration Fund
AWP – Annual Work Programme
BPO – Business Process Owner
CAF – Common Assessment Framework
CA – Contract Agent
CAST – Contract Agents Selection Tool
CBA – Cost-Benefit Analysis
CHAFEA – Consumers, Health, Agriculture and Food Executive Agency
COFUND – Co-Funding of Regional, National and International Programmes
CoI – Conflict of Interest
COSME – Competitiveness of Small and Medium-Sized Enterprises
CRaS – Common Representative Audit Sample
CRS – Common Representative Sample
CSC – Common Support Centre
CSES – Centre for Strategy and Evaluation Services
DG – Directorate-General
DG AGRI – Directorate-General for Agriculture and Rural Development
DG BUDG – Directorate-General for Budget
DG CLIMA – Directorate-General for Climate Action
DG CNECT – Directorate-General for Communications Networks, Content and Technology
DG DIGIT – Directorate-General for Informatics
DG EAC – Directorate-General for Education, Youth, Sport and Culture
DG ECHO – Directorate-General for European Civil Protection and Humanitarian Aid
Operations
DG ENER – Directorate-General for Energy
DG ENV – Directorate-General for Environment
DG GROW – Directorate-General for Internal Market, Industry, Entrepreneurship and
SMEs
DG HOME – Directorate-General for Migration and Home Affairs
DG HR – Directorate-General for Human Resources and Security
DG JUST – Directorate-General for Justice and Consumers
DG MARE – Directorate-General for Maritime Affairs and Fisheries
DG MOVE – Directorate-General for Mobility and Transport
DG RTD – Directorate-General for Research and Innovation
DG SANTE – Directorate-General for Health and Food Safety
EA – Executive Agency
EACEA – Education, Audiovisual and Culture Executive Agency
EACI – Executive Agency for Competitiveness and Innovation
EAHC – Executive Agency for Health and Consumers
EASME – Executive Agency for Small and Medium-sized Enterprises
EC – European Commission
EEA – European Economic Area
EFTA – European Free Trade Association
EIPP – European Investment Project Portal
EMFF – European Maritime and Fisheries Fund
EMI – Expert Management Internal
EPSO – European Personnel Selection Office
ERCEA – European Research Council Executive Agency
EU – European Union
EUCI – EU classified information
EQ – Evaluation Question
FAQ – Frequently Asked Questions
FCA – Financial Capacity Assessment
FEL – Legal Status of Entities
FET-Open – Future and Emerging Technologies - Open
FP – Framework Programme
FTE – Full-Time Equivalent
FVC – Financial Viability Check
GIPs – General Implementing Provisions
GPSB – eGrants and eProcurement Steering Board
HR – Human Resource
IAJM – Inter-Agency Job Market
IAS – Internal Audit Service
ICF – Internal Control Framework
ICM – Internal Control Monitoring
IF – Individual Fellowship
INEA – Innovation and Networks Executive Agency
ISF – Internal Security Fund
ISFB – Internal Security Fund - Borders
ISFP – Internal Security Fund - Police
IT – Information Technology
ITN – Innovative Training Networks
KM – Knowledge Management
KPI – Key Performance Indicator
LEAR – Legal Entity Appointed Representative
MFF – Multiannual Financial Framework
MoU – Memorandum of Understanding
MSCA – Marie Skłodowska-Curie actions
NCC – Network of Call Coordinators
NCP – National Contact Point
NEC – Network of Ethics Correspondents
NFO – Network of Financial Officers
NLO – Network of Legal Officers
NPO – Network of Project Officers
P4P – Project for Policy initiative
PC – Project Coordinator
PCOCO – Primary Coordinator Contact
PDM – Participant Data Management
PEAS – Proposal Expert Allocation System
PF – Policy Feedback
PFF – Policy Feedback Function
PFTF – Project Monitoring and Policy Feedback Task Force
PO – Project Officer
R&D&I – Research, Development and Innovation
R&I – Research & Innovation
RAO – Responsible Authorising Officer
RBN – Research Budgetary Network
REA – Research Executive Agency
REC – Rights, Equality and Citizenship
RES – Research Enquiry Service
RISE – Research and Innovation Staff Exchange
RTD – Research and Technological Development
RUF – Financial Services’ Network of the Commission
SC – Societal Challenge
SEDIA – Single Electronic Data Interchange Area
SEP – Proposal Submission and Evaluation Tool
SEWP – Spreading Excellence and Widening Participation
SFS – Specific Financial Statement
SME – Small and Medium-sized Enterprises
SwafS – Science with and for Society
TA – Temporary Agent
TEN-T – Trans-European Transport Network
TTA – Time To Amend
TTC – Time To Contract
TTG – Time To Grant
TTI – Time To Inform
TTP – Time To Pay
ToR – Terms of Reference
WP – Work Programme

1 Introduction

This report presents the results of the study supporting the evaluation of the
Research Executive Agency (REA) (2015-2018) (specific contract No RTD/R4/PP–06941-
2018). The study was carried out by PPMI Group, the Centre for Strategy and Evaluation
Services (CSES) and IDEA Consult. It began in July 2018 and was completed in
February 2019.

The Final Report of the study was produced on the basis of the requirements set out in
the Terms of Reference (henceforth: ToR), the information gathered and analysed during
the execution of the project, and the results of project meetings with the Steering
Group.

The Final Report provides the results of the study, its conclusions and recommendations.
The report is divided into the following parts:

• Part 1: Introduction;

• Part 2: Objectives and Methodology of the study;

• Part 3: Background: REA’s regulatory framework, mission and governance;

• Part 4: Evaluation results: the assessment of REA’s performance;

• Part 5: Conclusions and recommendations.

Finally, seven annexes are attached to the report. Annex 1 describes key activities
implemented during the study. Annex 2 provides a synthesis report of stakeholder
consultation activities completed during the project. Annexes 3 and 4 contain the
interview and survey questionnaires used during our data collection. Annex 5 presents
four in-depth study areas. Annex 6 includes quantitative benchmarking of the
Commission’s Executive Agencies. Annex 7 provides the data related to the retrospective
CBA.

2 Objectives and Methodology

2.1 Study objectives and scope

The main objective of this study was to support the evaluation of the operation of REA.
The scope of the study was to evaluate the effectiveness, efficiency and coherence of the
implementation of parts of the European Union programmes by the Agency in the period
from 16 July 2015 to 15 July 2018. The study covered the actions and projects
funded during this part of the programming period as well as the legacy of the
programmes and actions funded until the end of FP7. The study, however, did not focus
on the operational achievements of these programmes, in particular the results of the
projects co-funded under the programmes.

The evaluation of the operation of REA is organised into four tasks:

1. Assessment of the regulatory framework, REA mission and governance;

2. Assessment of REA’s performance in 2015-2018;

3. Cost–benefit analysis (CBA);

4. Synthesis, conclusions and policy recommendations.

The assignment combines a retrospective and a prospective analysis:

- The retrospective analysis concerns the regulatory and operational framework of REA
and its evolution during the evaluation period (Task 1), the assessment of REA’s
performance (Task 2), and the CBA (Task 3) related to the Agency’s past activities;
- The prospective analysis provides recommendations on the Agency’s operations
during the remaining part of the 2014-2020 programming period and the
implementation of new actions during the post-2020 programming period (Task 4).

The key findings, conclusions and recommendations of this study will inform internal
decision-making in the Agency and its parent DGs on possible improvements to the
implementation of the programmes delegated to the Agency. In addition, the study
results, which will be communicated to the European Parliament, the Council and the
European Court of Auditors, will be useful for accountability purposes.

2.2 Methodology

2.2.1 Overall framework for the evaluation of REA

The Tender Specifications require that the intervention logic underpinning the
establishment and the ongoing operations of REA be presented as a diagram (or table)
and included in the report. We therefore developed an intervention logic for the
operations of REA, setting out the context, objectives, inputs and processes, and the
outcomes (outputs, results and impacts) (see Figure 1 below).

Figure 1. Intervention logic on the operations of REA.


Source: compiled by PPMI based on desk research.

For the evaluation of the operation of REA, we also proposed a holistic approach to
organisational analysis, which identifies the important elements of organisational
activities and the general relationships among them. The organisational model groups a
number of organisational factors into the following three sets: (1) regulatory and
operational framework; (2) enablers; and (3) results (see Figure 2 below).

Figure 2. Organisational model for the evaluation of REA.


Source: adapted by PPMI from the CAF 2013 model.

The set of organisational factors relating to the regulatory and operational
framework of an Agency includes the following elements: (1) the mandate and
responsibilities of an Agency; (2) its external governance (autonomy and supervision by
the parent DGs); and (3) its organisational structure. The enablers, which determine
what an organisation does and how it executes its tasks to achieve the desired results,
include the following elements: (4) strategy and planning; (5) people; (6) other
resources; (7) partnerships; and (8) processes of an Agency. Finally, the results of an
Agency are broken down into the following types of (intended and unintended) results:
(9) key performance results; (10) customer-oriented results; (11) people results; and
(12) policy results.

2.2.2 Methodology

Our stakeholder consultation strategy comprised three surveys, an extensive
interview programme, four in-depth study areas of the Agency's performance and a
focus group.

Our survey programme involved three different surveys:

 Survey A: survey of REA’s beneficiaries and unsuccessful applicants;

 Survey B: survey of external experts contracted by REA;

 Survey C: survey of EC officials in the mirror units of the Commission (DG RTD; DG
GROW; DG EAC; DG CNECT; DG AGRI; DG HOME).

Survey A was addressed to all successful applicants/beneficiaries and unsuccessful
applicants for calls launched under the 2015-2018 AWPs (not including the legacy actions
of FP7). Survey B approached all external independent experts (including evaluators and
monitoring experts) contracted by the Agency between 2015 and 2018. Survey C
targeted over 80 EC officials in the mirror units of the Commission who had worked with
the Agency during the evaluation period.

Table 1. Targeting and coverage of the survey programme, including key characteristics
of each survey sample.

Survey A: REA's beneficiaries and unsuccessful applicants
‒ Types of respondents/samples: all applicants who applied for REA grants under the
2015-2018 AWPs, both successful and unsuccessful; filters and routing were applied to
ensure that respondents would receive only relevant questions.
‒ Coverage: overall satisfaction with REA's performance (e.g. TTI, TTC/TTG, TTP, etc.),
quality/timeliness of the services provided across its programmes, etc. Questions were
aligned with the previous evaluation of the Agency.
‒ Relevant characteristics of the sample: around 85 % of potential beneficiary
respondents were H2020 MSCA contacts from the host institutions, who are generally
not in direct contact with REA and are less motivated to participate in the survey
programme. Unsuccessful applicants are generally less willing to participate in the
survey programme than the Agency's beneficiaries.

Survey B: External experts contracted by REA
‒ Types of respondents/samples: external experts contracted by REA between 2015 and
2018 (evaluation and monitoring experts).
‒ Coverage: overall satisfaction with REA's performance and quality/timeliness of the
services provided in business processes and other activities in which external experts
were involved in the period from 2015 to 2018. Questions were aligned with the
previous evaluation of the Agency.
‒ Relevant characteristics of the sample: external experts are generally more willing to
participate in the survey programme than any other respondent group.

Survey C: EC officials in the parent DGs
‒ Types of respondents/samples: over 80 officials in REA's mirror units: DG RTD; DG
GROW; DG EAC; DG CNECT; DG AGRI; DG HOME.
‒ Coverage: key topics included preparation of the AWPs and call topics, launch of the
calls, evaluation/selection of proposals, provision of policy feedback and maintenance
of know-how in the Commission.
‒ Relevant characteristics of the sample: the sample included only those EC officials who
were most actively working with REA/worked with it the longest during 2015-2018.

Source: prepared by PPMI.

Our interview programme included 46 interviews, including:

 31 interviews with the relevant EC and REA officials and other relevant stakeholders of
the Agency;

 15 interviews with REA’s beneficiaries.

The scope of in-depth study areas included the following issues:

 In-depth study area 1: a study on REA's coherence, the separation of tasks/roles
between the Commission and the Agency, and the maintenance of know-how within
the Commission;

 In-depth study area 2: effectiveness and efficiency of the provision of
administrative and logistical support activities, including the management of
independent experts;

 In-depth study area 3: effectiveness and efficiency of the newly introduced key
business processes and the efficiency gains achieved;

 In-depth study area 4: added value of REA, key success stories and lessons learned
from the 2015-2018 period.

A focus group involving participants from REA, the Commission services and other key
stakeholders of the Agency and of other Executive Agencies (19 participants in total)
was organised by PPMI towards the end of the evaluation. In addition to these
stakeholder consultation activities, our study relied on an extensive documentary review
and desk research, an analysis of survey and administrative data, and a retrospective
cost–benefit analysis. Further details on the methodology and the progress achieved
during the study are presented in Annex 1.

3 Background: REA's Regulatory Framework, Mission and Governance System

3.1 The regulatory framework of setting up the Executive Agencies

The Executive Agencies were established following the reform of the Commission that
took place in 2000. To refocus its resources on its core functions, the Commission
created several Executive Agencies whose main objective was to implement EU
programmes and whose mandate was usually limited to a specified time frame.

Council Regulation (EC) No 58/2003 of 19 December 2002¹ established a clear
institutional framework for the Executive Agencies. According to this Regulation, the
Executive Agencies perform managerial tasks with no discretionary power to make
political choices, leaving all policy roles to the Commission. The Regulation also laid
down the statute for all the Executive Agencies, regulating their tasks, structure,
operation, budget system, staff, supervision and responsibility. It stipulates that, to
successfully achieve their main objective – the implementation of one or more EU
programmes – the following key tasks should be entrusted to the Executive Agencies:

 Implementing some or all of the phases in the programme life-cycle, in relation to
specific individual projects, in the context of implementing a Community programme,
and carrying out the necessary checks to that end, by adopting the relevant decisions
using the powers delegated to them by the Commission;

 Adopting the instruments of budget implementation for revenue and expenditure
and carrying out all activities required to implement a Community programme on the
basis of the powers delegated by the Commission, in particular activities linked to the
awarding of contracts and grants;

 Gathering, analysing and transmitting to the Commission all the information
needed to guide the implementation of a Community programme.

The Executive Agencies are a specific type of EU Agency with their own legal personality
and some autonomy, but they operate according to rules determined by the
Commission and are supervised by their parent DGs. The fundamental features of these
Agencies are thus a combination of autonomy and dependence. Agency governance is
established in the legal framework applicable to the Executive Agencies and in the
guidelines for the establishment and operation of Executive Agencies financed by the
general budget of the EU².

According to Council Regulation (EC) No 58/2003, an external evaluation of an
Executive Agency needs to be carried out every three years and submitted to the
Steering Committee of the Executive Agency, to the European Parliament, to the Council
and to the European Court of Auditors. This evaluation should include a cost–benefit
analysis³.

¹ Council Regulation (EC) No 58/2003 of 19 December 2002 laying down the statute for Executive Agencies to
be entrusted with certain tasks in the management of Community programmes (2002).
<http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2003:011:0001:0008:EN:PDF>
² Commission Decision of 2.12.2014 establishing guidelines for the establishment and operation of Executive
Agencies financed by the general budget of the Union (2014).
<http://ec.europa.eu/transparency/regdoc/rep/3/2014/EN/3-2014-9109-EN-F1-1.PDF>

3.2 REA history, mission and governance

REA was established by Commission Decision 2008/46/EC⁴ following a recommendation
by the Regulatory Committee on Executive Agencies and the Budgetary Authority to
establish a separate entity to implement parts of the FP7 Specific Programmes People,
Capacities and Cooperation. At the time of creation, the Agency's tasks were the
following:

 Implementation of parts of the specific programmes listed above;

 Execution of the relevant parts of the budget;

 Collection, analysis and dissemination of information to guide programme
implementation and evaluation;

 Provision of logistical and administrative support to the specific programmes listed
above, particularly in the areas of call publication, proposal reception and evaluation,
contracting of evaluators, preparation of evaluator payments, participant validation
services and preparation of financial viability checks.

REA's mandate was formally extended by Commission Decision 2013/778/EU⁵, which
repealed the previous act of establishment, Decision 2008/46/EC. According to the new
Decision, REA continued with the implementation of the remaining FP7 projects but also
implemented large parts of Horizon 2020 and provided programme-related
administrative and logistical support services to other Commission services. REA
continued to implement all the FP7 actions delegated to the Agency under its first
mandate, as well as their successor actions under Horizon 2020, namely the specific
objective "Marie Skłodowska-Curie actions" (under the 'Excellent Science' pillar), Space
research (under the 'Industrial leadership' pillar) and Security research (under the
'Societal challenges' pillar, Societal challenge 7: Secure Societies – Protecting Freedom
and Security of Europe and Its Citizens). The Agency also continued to implement the
'Research for SMEs' and 'SME associations' legacy actions from the FP7 'Capacities'
specific programme.

Compared to FP7, REA's mandate under Horizon 2020 was expanded to cover
additional activities and objectives, such as FET-Open from Pillar 1 and, from Pillar 3,
most activities under specific objective 2: "Food security, sustainable agriculture and
forestry, marine and maritime and inland water research, and the bio-economy", as
well as specific objective 6: "Europe in a changing world – Inclusive, innovative and
reflective societies". Under specific objective 7: "Secure societies – Protecting freedom
and security of Europe and its citizens", REA's mandate was updated to also cover the
activities of digital security research. At the beginning of 2015, REA's mandate was
adjusted with the change in the legal reference for the "Science with and for Society"
(SwafS) and "Spreading Excellence, Widening Participation" (SEWP) specific objectives.
In 2017, REA prepared for an additional extension of its mandate, entering into force on
1 January 2018, to extend participant validation services to all direct management
operations of the Commission services and other EU bodies (linked to SEDIA) and to
implement projects generating EU classified information (EUCI) under

Societal Challenge 7. With these additional delegated activities, REA was mandated to
implement approximately 20 % of the H2020 budget.

³ Article 3(1) of Council Regulation (EC) No 58/2003.
⁴ Commission Decision 2008/46/EC of 14 December 2007 setting up the "Research Executive Agency" for the
management of certain areas of the specific Community programmes People, Capacities and Cooperation in the
field of research in application of Council Regulation (EC) No 58/2003.
⁵ Commission Implementing Decision 2013/778/EU of 13 December 2013 establishing the Research Executive
Agency and repealing Decision 2008/46/EC.

Under H2020, REA was also mandated to provide administrative and logistical
support services to all entities involved in H2020 management, and to provide
participant validation services not only for H2020 but also for programmes in the areas
of health/consumer protection, competitiveness and innovation, and education, culture
and citizenship: Erasmus+, Creative Europe, Europe for Citizens, EU Aid Volunteers,
COSME, the Health and Consumer programmes, the Research Fund for Coal and Steel,
and the Competitiveness and Innovation Framework Programme.

The portfolio of programmes for which the Agency provided logistical and administrative
support services was extended in 2015 to include the Agri-promotion programme, the
Internal Security Fund (ISF), the Asylum, Migration and Integration Fund (AMIF), Rights,
Equality and Citizenship (REC) and Justice. As of 1 January 2018, REA was also entrusted
with the legal validation of third parties and the preparation of financial capacity
assessments for all the Commission services and Executive Agencies which implement
grants and procurements under direct management, and for the first level of
transactions under indirect management, in the framework of the implementation of
SEDIA. Overall, these developments demonstrate that REA is increasingly becoming a
central provider of support services in the Commission.

The mission of REA is to support the EU Research and Innovation policy by funding
high-quality research and innovation projects that generate knowledge for the benefit of
society⁶. First, the Agency assists the Commission in achieving the objectives of the
Research Framework Programmes and of the EU strategies to foster growth through
research and innovation, by implementing parts of the Horizon 2020 and FP7
Framework Programmes. Second, it delivers services to the research community by
ensuring the implementation of its part of the EU funding for research and innovation,
and by providing administrative and logistical support services to EU bodies
implementing Horizon 2020 and selected other programmes as well as, within the
SEDIA framework, to participants in grant and procurement activities across all
European Union programmes and for the benefit of the EU bodies implementing these
programmes.

The main governing bodies of REA are i) the Steering Committee and ii) the
Director. The Steering Committee, composed of representatives of the parent DGs
(DG RTD, DG GROW, DG EAC, DG CNECT, DG AGRI and DG HOME) and of DG HR,
which participates as an observer, is responsible for adopting the Agency's annual work
programme, adopting the administrative budget, deciding on the organisation and
structure of the Agency and endorsing the Director's Annual Activity Report together
with financial and management information. The Director is appointed by the
Commission for a term of four years and has authority over the Agency staff. The
responsibilities of the Director include representing and managing the Agency,
preparing the work of the Steering Committee, preparing and publishing the annual
reports on the Agency's activities, implementing the Agency's administrative budget and
setting up management and internal control systems. Marc Tachelet was appointed as
the new Director of the Agency in April 2017; he took over from Gilbert Gascard, who
had served as Director since 2012.

⁶ REA 2018 Annual Work Programme, p. 7.
Figure 3. The governance structure of REA and its parent DGs: the Agency is governed
by its Director (Marc Tachelet) and overseen by a Steering Committee composed of the
representatives of DG RTD, DG GROW, DG EAC, DG CNECT, DG AGRI and DG HOME
(DG HR participates as an observer).

Source: compiled by PPMI.

Although REA is a separate legal entity, it is supervised and monitored by the
Commission. This is done via the Steering Committee meetings, as well as several other
mechanisms outlined in the Memorandum of Understanding between REA and its parent
DGs⁷. These include the parent DGs' management meetings, which are attended by the
Agency Director (weekly for DG RTD and on an ad hoc basis for the other parent DGs),
and dedicated meetings related to the implementation of the programmes. These meetings
help in maintaining close working relations with parent DGs (‘mirror units’), Commission
services and offices, and other Executive Agencies. Coordination between the relevant
units of REA and their counterparts at the parent DGs occurs daily to ensure a continuous
exchange of information in the implementation of the H2020 and FP7 legacy
programmes. In accordance with REA's Delegation Act, the Agency reports regularly to
the parent DGs and to the Steering Committee on the progress achieved in implementing
the programmes (e.g. through the interim report to the parent DGs and the Steering
Committee, the Annual Activity Report and ad hoc provision of data and information). In
addition, officials seconded by the Commission to REA are entrusted with positions of
responsibility and are characterised by their twofold statutory link to both the
Commission (as seconded officials) and the Agency (as temporary agents). As a result,
they play a particular part in the transfer of know-how between the Commission and the
Agency, which is particularly important in the phasing-in period of newly delegated
programmes.

The Agency’s current organisational structure mirrors the programmes under REA’s
responsibility and consists of three departments and 14 units:

 Department A 'Excellent Science' consists of five Units responsible for the
implementation of the bottom-up programmes 'Marie Skłodowska-Curie Actions' and
'Future and Emerging Technologies (FET-Open)'.

 Department B 'Industrial leadership and Societal Challenges' consists of five Units
responsible for the implementation of the top-down programmes: Space Research;
Sustainable Resources for Food Security and Growth; Inclusive, Innovative and
Reflective Societies; Safeguarding Secure Society; Spreading Excellence, Widening
Participation; and Science with and for Society.

 Department C consists of four Units responsible for Administration, Finance,
Participant validation and Expert Management and Support. It also hosts the Legal
Affairs, Internal Control and Reporting capability.

⁷ Memorandum of Understanding between the Research Executive Agency and DG Research and Innovation, DG
Education and Culture, DG Communication Networks, Content and Technology, DG Agriculture and Rural
Development, DG Internal Market, Industry, Entrepreneurship and SMEs, DG Migration and Home Affairs –
Modalities and Procedures of Interaction, Document Final Version dated 30/11/2015.

Although no major organisational changes have occurred since 2015 in the Departments
working on programme implementation (A and B), two units in Department C underwent
restructuring. Two sectors under Unit C3 were divided into three: Legal Validation of
Participants, Financial Validation of Participants, and Legal and Financial Verification.
Unit C4 was also restructured from two sectors into three: Expert Contracting, Expert
Payments and Evaluation Support. These changes reflect the goal of achieving higher
operational efficiency and of optimising and streamlining day-to-day procedures.

In accordance with Article 18 of Council Regulation (EC) No 58/2003⁸, the Executive
Agency staff consists of Community officials employed as temporary staff members by
the Commission (in practice, Commission officials seconded in the interest of the service
to the Agency) in positions of responsibility in the Executive Agency, temporary staff
members directly recruited by the Executive Agency, and contract staff recruited by the
Executive Agency. At the end of 2017, REA employed 693 staff. Please refer to Figure 47
in Annex 5 for the organisational chart of REA.

3.3 Overview of REA’s performance during 2015-2018

Even though REA had already reached an advanced level of organisational maturity by
the end of FP7, the first years of H2020 brought new challenges. The commitment to
simplification measures was an integral part of the Agency's activities in 2014-2015. The
new simplification measures brought substantial changes to REA's internal processes and
procedures and allowed it to absorb the increases in workload levels. For more details
on the changes and challenges experienced by the Agency, as well as an assessment of
the simplifications introduced to address them, refer to sections 4.1 and 4.2.

In comparison to this period, the years 2016-2017 were relatively stable and marked by
a consolidation of activities. According to the information provided in its Annual
Activity Reports, the Agency continued consolidating its project-monitoring process,
including the implementation of a proportional ex ante control striking a balance between
trust and control by focusing on the identification and mitigation of risks.
Administratively, recruitment of new staff and management of its growth/size/vacancy
rate was a key priority for REA, with the newly functional inter-Agency labour market
opening new possibilities, but also risks, for the Agency. In line with the
recommendations made in the second external evaluation of REA (2012-2015), the
Agency devised an action plan and implementation measures for 2017 and subsequent
years. Another key line of activity for REA was to further improve the provision of policy
feedback to its parent DGs, as well as to improve its dissemination and exploitation
activities.

The data provided in Table 2 show that the Agency reached cruising speed, performing
similar numbers of operations in 2016-2017 compared to the 2014-2015 period. In most
key areas of work (i.e. the number of calls launched, proposals evaluated and grant
agreements signed), the Agency experienced stable workload levels. It should be noted
that although the number of proposals/grants signed did not change markedly in 2016
and 2017, the total stock of workload continued to increase due to the operational
budget allocated to the relevant H2020 programmes and actions, the average grant size
and the corresponding number of grants, as discussed in more detail in section

4.2.4. In addition, the extension of the Agency's portfolio of activities with respect to its
administrative and logistical support activities also increased REA's workload level.

⁸ Framework Regulation for Executive Agencies: Council Regulation (EC) No 58/2003 of 19 December 2002
laying down the statute for Executive Agencies to be entrusted with certain tasks in the management of
Community programmes.

Table 2. Evolution of REA's activities from 2012 to mid-2018.

Year    Number of calls    Number of proposals    Number of grant
        launched           evaluated              agreements signed
2012    16                 10,846                 2,180
2013    15                 14,068                 2,157
2014    25                 11,473                 1,797
2015    29                 14,639                 1,918
2016    22                 12,839                 1,713
2017    35                 14,783                 1,967
2018*   30                 14,017                 401

Note: a number of evaluations were still ongoing at the time of the preparation of this report in January 2019, thus the data
presented for 2018 is not complete.
Source: overview of REA’s Annual Activity Reports, annual work plans and REA’s administrative data (as of 3 January 2019).

During 2012-2017, REA managed payment appropriations of some EUR 1.1-1.7 billion.
The administrative budget of the Agency was in the range of EUR 33.4-43.9 million,
without central support services (in payments), during this period. Accordingly, the cost-
effectiveness of controls in REA, estimated as the percentage of operating costs over the
operational budget in terms of payments executed, reached around 2.6 % in 2016 and
2017 (see Table 3 below).

Table 3. Operational and administrative budgets of REA with the cost-effectiveness of
controls from 2012 to 2017.

                                            2012      2013      2014      2015      2016      2017
Operational budget (million EUR),
payments                                    1,479.87  1,393.98  1,096.88  1,418.48  1,642.94  1,697.18
Administrative budget (million EUR),
without central support services,
payments*                                   38.96     41.67     33.39     39.88     42.42     43.92
Percentage of operating costs over the
operational budget in terms of payments
executed                                    2.6 %     3.0 %     3.0 %     2.8 %     2.6 %     2.58 %

Source: overview of REA’s Annual Activity Reports and annual work plans.
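The cost-effectiveness figures in Table 3 follow directly from the two budget rows: the administrative budget expressed as a percentage of the operational budget, both in payments executed. The short sketch below is purely illustrative (the values are copied from Table 3; the variable names are our own, and rounding to two decimals may differ by up to 0.01 percentage points from the published figures, which appear to mix rounded and truncated values):

```python
# Cost-effectiveness of controls: administrative (operating) budget as a
# percentage of the operational budget, both in payments executed.
# All values in million EUR, taken from Table 3.
operational = {2012: 1479.87, 2013: 1393.98, 2014: 1096.88,
               2015: 1418.48, 2016: 1642.94, 2017: 1697.18}
administrative = {2012: 38.96, 2013: 41.67, 2014: 33.39,
                  2015: 39.88, 2016: 42.42, 2017: 43.92}

# Ratio for each year, rounded to two decimal places.
cost_effectiveness = {year: round(100 * administrative[year] / operational[year], 2)
                      for year in operational}

for year in sorted(cost_effectiveness):
    print(f"{year}: {cost_effectiveness[year]} %")
```

For 2016, for example, 42.42 / 1,642.94 ≈ 2.58 %, consistent with the roughly 2.6 % cited in the text.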

The stock of running projects grew from 6,434 projects in 2012 to 6,925 projects in
2014. At the end of 2016, REA was managing 6,658 grants (see Figure 4 below) of which
3,815 were H2020 grants and the remaining projects were legacy actions from FP7 (see
Table 4). This shows that the legacy actions implemented by the Agency still constituted
a substantial part of the Agency’s workload during the evaluation period.
Table 4. Breakdown of REA’s running projects, from 2015 to 2018.

Programme 2015 2016 2017 2018

H2020 2,190 3,815 4,984 5,804

FP7 4,830 2,843 1,436 418

Total 7,020 6,658 6,420 6,222

Source: compiled by PPMI based on AARs and REA’s administrative data (as of 3 January 2019).

With the change from FP7 to H2020, more top-down programmes were delegated to
REA. Due to the higher-than-expected number of new H2020 grants, the increase in the
workload for project monitoring was proportionally higher than the overall increase in the
portfolio of running projects, although the workload associated with administering FP7
projects gradually decreased. Overall, the stock of projects decreased slightly from 7,020
running projects in 2015 to just over 6,220 projects in 2018. To accommodate the
workload related to its activities, the Agency grew from 449 positions in 2014 to 595
positions in 2018⁹.

Figure 4. REA's portfolio of running projects, from 2012 to mid-2018 (2012: 6,434;
2013: 7,129; 2014: 6,925; 2015: 7,020; 2016: 6,658; 2017: 6,420; 2018: 6,222).

Source: overview of REA's Annual Activity Reports, annual work plans and REA's administrative data (as of 3 January 2019).

Management and provision of administrative and logistical support activities

Under H2020, REA became a key provider of support services for the H2020/Research
family and a variety of other programmes described in the previous section of the report.
The Agency provided centralised support for the activities outlined in the assessment of
the management and provision of central support services included in Annex 5. In
addition, REA held the role of Business Process Owner (BPO) for expert payment and
contracting, coordinating the development of the Expert Management in the Participant
Portal (EMPP) and Expert Management Internal (EMI) IT tools, as well as BPO for
proposal management (submission and evaluation) – all key business processes for the
implementation of Horizon 2020. The Business Process Owners and Managers for
participant management, which includes control over the PDM/Beneficiary Register
applications, were also based in REA.

⁹ The actual number of operational staff based on REA's administrative data (as of 3 January 2019).
The management of administrative and logistical support services constituted a very
sizeable part of the Agency's activities during 2016-2018. The total costs of the support
services represented around 30 % of the operating costs of REA in 2016-2017. Table 5
shows the evolution of the key central support services provided by the Agency between
2014 and the first six months of 2018. Overall, the data indicate relatively stable
workload levels, especially between 2014 and 2017, with some increases in the number
of expert contracts signed and payments made. On the other hand, REA supervised
fewer experts on site in 2017 (i.e. 7,000 experts) compared to previous years. This
evolution is explained by the Agency's strategic shift towards fewer on-site and more
remote evaluations compared to the early years of H2020, when the number of experts
supervised on site reached its peak (i.e. some 20,000 experts were managed on site in
2014).

Table 5. Overview of key Central Support Services provided by REA between 2014 and
mid-2018.

Supervision of evaluation activities and management of independent experts:
‒ 2014: around 20,200 experts supervised on site and remotely; some 11,400 expert
contracts signed
‒ 2015: around 8,800 experts supervised on site; 11,000 expert contracts signed
‒ 2016: 6,730 experts supervised on site; around 13,100 expert contracts signed
‒ 2017: 7,000 experts supervised on site; 15,700 expert contracts signed
‒ 2018 (Jan–Jun): around 3,350 experts supervised on site; 7,400 expert contracts
signed

Provision of payments to expert evaluators:
‒ 2014: 10,600 payments made
‒ 2015: around 19,000 payments made
‒ 2016: around 21,000 payments made
‒ 2017: 21,700 payments made
‒ 2018 (Jan–Jun): 12,000 payments made

Management of the Research Enquiry Service (RES):
‒ 2014: 13,000 enquiries received
‒ 2015: 10,700 enquiries received
‒ 2016: nearly 11,000 enquiries received
‒ 2017: 7,900 enquiries received
‒ 2018 (Jan–Jun): 4,100 enquiries received

Provision of support for Financial Viability Checks (FVCs)/Financial Capacity
Assessments (FCA) (since 2017):
‒ 2014: 1,600 cases handled
‒ 2015: 2,000 cases handled
‒ 2016: over 1,850 cases handled
‒ 2017: over 1,300 cases handled
‒ 2018 (Jan–Jun): 860 cases handled

Provision of REA's Validation Services:
‒ 2014: 5,923 legal entities participating in research projects validated + 11,390 LEAR
extended mandates granted
‒ 2015: 7,400 entities validated + 10,450 LEAR extended mandates validated
‒ 2016: 7,320 entities validated + 7,700 LEAR extended mandates validated
‒ 2017: 7,800 entities validated + 8,900 LEAR extended mandates validated
‒ 2018 (Jan–Jun): around 3,000 entities validated + 4,300 LEAR extended mandates
validated

Note: starting from 2017 the reporting methodology on FCA changed – only cases on which FCA was actually done were
reported. Source: compiled by PPMI.

During 2016-2018, the Agency continued its cooperation with the Common Support
Centre (CSC) to optimise the delivery of support services and to harmonise the
implementation of H2020. For example, in 2017 the two bodies continued work on the
business processes for expert management and the further development of IT tools,
aiming at the best possible allocation of proposals to experts and improved detection of
conflicts of interest. The Agency undertook other developments during the evaluation
period of this study, notably:

 The Agency was exploring further efficiency gains for Legal Entity Authorised
Representatives (LEAR) validations and reviewing the related business processes.

 The update of the “Rules for Administrative and Logistics Support Services” and the
further harmonisation of the “Rules for legal and financial viability checks” were
initiated in 2016. The update process ended in 2018 with the adoption of the “Rules
for the validation support services provided by the Research Executive Agency for EU
grant and procurement procedures based on e-Grant and eProcurement Corporate
Information Systems in the context of the Single Electronic Data Interchange Area
(SEDIA)” (endorsed by the eGrants and eProcurement Steering Board (GPSB) on
10/10/2018) and the “EU Funding & Tenders – Rules on legal entity validation, LEAR
appointment and financial capacity assessment” (endorsed by the GPSB on
25/01/2018).

 A new model contract for experts was implemented in April 2017.

 REA has been playing a major role in the development and roll-out of SEDIA, building
on REA's participant validation activities supported by the Participant Data
Management (PDM) tool. The implementation of SEDIA started at the beginning of
2018, with plans to integrate several IT tools into a standardised electronic exchange
system for procurement and grant management in the Commission. For more details
on the assessment of the management and provision of central support services
(including SEDIA), refer to Annex 5.

These and other key changes to REA's provision of administrative and logistical support
activities were expected to produce efficiency gains and/or lead to better-quality
services. The assessment of the effectiveness and efficiency of these measures was a
key task in this assignment.

4 Evaluation Results: the Assessment of REA’s Performance in 2015-2018

4.1 Effectiveness, efficiency and coherence

4.1.1 Effectiveness

In assessing the effectiveness of REA’s performance in 2015-2018, the evaluation team
first reviewed the actual operations of REA and to what extent they were in line with the
legal framework establishing the Agency. Evidence was documented via desk research,
analysis of the relevant legal (Commission Decisions, Delegation Acts) and operational
(AWPs and programmes, AARs and the MoU between the Agency and the parent DGs)
documents. In addition, the findings from the interviews with EC and REA officials,
surveys of EC officials, beneficiaries, unsuccessful applicants and external experts,
previous CBA studies as well as staff satisfaction surveys informed the evaluation.

The following two documents introduced the key changes to the legal framework of REA,
which were most relevant during the evaluation period:

‒ Commission Decision of 12.12.2014 (Decision C(2014) 9450) amending Commission
Decision C(2013) 9418 on delegating powers to the Research Executive Agency with a
view to performance of tasks linked to the implementation of Union programmes in
the field of research and innovation comprising, in particular, implementation of
appropriations entered in the general budget of the Union;
‒ Commission Decision of 14.7.2017 (Decision C(2017) 4900) amending Commission
Decision C(2013) 9418 regarding the delegation of tasks for the setting-up of a
Single Electronic Data Interchange Area, the transfer of human resources in line
with a redistribution of tasks and the delegation to the Research Executive Agency
of projects generating EU classified information (entered into force on
1 January 2018).

The adoption of the Multiannual Financial Framework (MFF) 2014-202010, Commission
Decision 2013/778/EU11 (the Establishment Act) as well as Commission Decision C(2013)
941812 (the Delegation Act) have also significantly contributed to the establishment of
the Agency’s legal framework which was in place during the period of 2015-2018. The
changes introduced through these documents were examined in the previous
evaluation13, hence this report does not assess them in detail.

Our overall approach to Task 2.1 focused on assessing the link between the Agency’s
mandate/responsibilities and its external governance framework, defined by the key
developments outlined above, and the key enablers allowing REA to perform its tasks
according to the work programmes and needs of relevant stakeholders. Below we present
our findings for the effectiveness part of the study.

To what extent is REA operating according to the legal framework establishing it?

Based on the analysis of desk research and interview data, the overall result of our
evaluation for this question is that the Agency respected its legal framework in
2015-2018. Despite an increase in the scope of its activities, the Agency successfully
accommodated the handover of the new activities following the extensive preparation
process. As a result, REA supported the Commission services by effectively contributing
to the objectives pursued by the new programmes and activities delegated to the Agency
in 2014 and 2017. REA’s activities in 2015-2018, as planned in the Annual Work
Programmes and reported in the Annual Activity Reports, corresponded to the tasks set
out in the relevant Commission Decision and the Delegation Act.

REA’s mandate was adjusted with the Commission Decision C(2014) 9450, which
came into effect at the beginning of 2015. A technical change was introduced by the new
Commission Decision in the legal reference for the "Science with and for Society" (SwafS)
and "Spreading Excellence, Widening Participation" (SEWP) specific objectives. The
Commission Decision also triggered changes in the governance of REA as a new parent
DG, namely the Directorate-General for Migration and Home Affairs (DG HOME), joined
the Steering Committee and REA, in line with practices across the Commission, abolished
its internal audit capability by transferring all internal audit competencies to the Internal
Audit Service (IAS).

10 Council Regulation (EU, Euratom) No 1311/2013 of 2 December 2013 laying down the multiannual financial
framework for the years 2014-2020 (2013). OJ L 347, p. 884.
<https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32013R1311>
11 Commission Implementing Decision of 13 December 2013 establishing the Research Executive Agency and
repealing Decision 2008/46/EC: extension of the mandate until 2024 in 2014 (entered into force on
1 January 2015).
12 Commission Decision C(2013) 9418 of 20 December 2013 on delegating powers to the Research Executive
Agency with a view to the performance of tasks linked to the implementation of Union programmes in the field
of research and innovation comprising, in particular, implementation of appropriations entered in the general
budget of the Union, as amended by Commission Decision C(2014) 9450, Commission Decision C(2015) 8754
and Commission Decision C(2017) 4900.
13 PPMI (2016). Evaluation of the Operation of REA (2012-2015). Final Report.
Another major change introduced to the Agency’s legal framework during the evaluation
period was the adoption of Commission Decision C(2017) 4900 in July 2017 which
further updated REA’s mandate. Building on the Agency’s experience in providing
participant validation activities for H2020, REA was entrusted with the legal validation of
third parties and the preparation of financial capacity assessments for all awarded grants
and procurements under direct management, as well as the first level of transactions
under indirect management, in the framework of the implementation of the Single
Electronic Data Interchange Area (SEDIA). In addition, the Agency was mandated to
implement projects generating EU classified information for Societal Challenge 7.

REA devoted significant efforts to optimise its processes for legal and financial validation
of participants in view of the establishment of a standardised electronic exchange system
for procurement and grant implementation (through the integration of several IT tools)
serving the Commission as a whole. This optimisation represented a considerable
challenge to the Agency during 2017-2018 in terms of workload. In order to evaluate
REA’s preparedness for the provision of legal and financial validation of participants, a
formal readiness assessment14 took place at the end of 2017. The assessment found that
the governance arrangements and the organisational structure, as well as the business
processes and the procedures put in place by REA were fully adequate to ensure the
delivery of the newly delegated SEDIA service. In terms of the supporting IT tool, which
was supplied by DG DIGIT, however, there remained scope for further increases in
efficiency. As a result, REA began the timely delivery of corporate validation services on
1 January 2018 when Commission Decision C(2017) 4900 came into effect. During 2018,
the participant validation activity grew progressively with the integration of new
clients from both grant and procurement domains. The on-boarding process will continue
in 2019 until the full set of clients starts using the corporate solution.

The increase in the scope of its activities during the evaluation period required REA to
cooperate closely with its parent DGs in order to effectively assist them in the
implementation of delegated programmes and the achievement of their policy objectives.
The modalities and procedures of interaction between the Agency and its parent DGs
were set out in the Memorandum of Understanding (MoU)15 between REA and its parent
DGs in 2016. The new MoU replaced the previous MoU version signed on
4 September 2014 taking into account the changes made to its mandate with Decision
C(2013)9418 and Decision C(2014)9450 as well as subsequent adjustments in REA’s
governance structure.

The six parent DGs of REA, namely DG Research and Innovation, DG Education, Youth,
Sport and Culture, DG Communication Networks, Content and Technology, DG
Agriculture and Rural Development, DG Internal Market, Industry, Entrepreneurship and
SMEs, and DG Migration and Home Affairs, set out in the MoU a supervision strategy
aimed at avoiding gaps or duplication of efforts resulting from crossover between their
monitoring and supervision tasks and the execution tasks of the Agency. Specific provisions
of the MoU are analysed in greater detail in subsequent questions in the effectiveness
and coherence parts (please refer to section 4.1.3) of the study relating to policy
coherence.

Operational question: was the operation of REA flexible enough to accommodate key
changes (esp. those induced by the extended mandate), at the same time maintaining
concordance with the legal framework establishing the Agency?

14 IAS (2018). Audit on the REA’s preparedness to deliver SEDIA-related services. Closing Note.
15 Memorandum of Understanding between the Research Executive Agency and DG Research and Innovation,
DG Education and Culture, DG Communication Networks, Content and Technology, DG Agriculture and Rural
Development, DG Internal Market, Industry, Entrepreneurship and SMEs, DG Migration and Home Affairs –
Modalities and Procedures of Interaction, Document Final Version dated 30/11/2015.
The overall findings are that REA’s operation was flexible enough to accommodate key
changes that were effectively and smoothly introduced (particularly in relation to the
Agency’s extended mandate in 2014 and 2017). At the same time, the Agency operated
according to the legal framework establishing it. Although the Agency experienced
substantial challenges, most of the interviewed Agency and DG officials reported that the
Agency managed its tasks well. Below we present the key changes that affected REA’s
performance between 2015 and 2018 and the actions taken by the Agency to flexibly
adjust its internal operations and procedures in accordance with the emerging challenges
and needs.

Key change 1: extension of the Agency’s mandate in 2014, leading to the adoption of a
new organisational structure and HR strategy

The organisational structure of the Agency was redefined in 2014 in order to prepare the
Agency for the implementation of its new mandate. In the context of the adoption of the
Horizon 2020 framework and the reorganisation of the Commission in 2014, REA’s
mandate was updated and extended until 2024. The Agency started to fully implement its
new mandate with an increased portfolio of activities (with the addition of FET-Open,
SC2, SC6, SC7-cyber security, SwafS and SEWP programme) and a wider range of
administrative and logistical support services, extended to additional clients in 2015 as
discussed in section 3.2. REA also adopted various workload management measures as
well as a new HR strategy during the evaluation period to accommodate these changes.
For a detailed assessment of other measures and simplifications developed and the key
success stories and lessons learned, refer to section 4.1.2 as well as Annex 5.

Key change 2: delegation of a specific role in SEDIA to REA and its implementation in
2017-2018

As mentioned in the question above, in 2017 REA was delegated a major role in the
development and roll-out of SEDIA. Given the Agency’s extensive experience in providing
validation services for the Research family and other EU programmes, REA started the
implementation of SEDIA as planned. Thus, as of January 2018, REA handles validations
not only for Horizon 2020 and a handful of other programmes but also for all the
Commission and other Executive Agencies’ grant and procurement activities under direct
management and first level of indirect management. This one-stop shop for legal and
financial data of participants provides an important simplification for SEDIA clients
(including applicants, candidates and tenderers). Although the volume of transactions did
not reach the CBA estimate in the first semester of 2018, REA carried out important work
in facilitating the on-boarding of new clients in preparation for the significantly higher
workload during the second semester16. A number of organisational, technical and legal
aspects of the project, however, presented challenges to REA. For more details on the
implementation of SEDIA, please refer to Annex 5.

Key change 3: introduction of remote evaluations

Another key change that affected the Agency during the evaluation period began with
REA’s decision to extend the use of remote evaluations in 2015. The steadily rising
number of proposals submitted and the space limitation for large calls forced REA to
introduce fully remote evaluation procedures for calls with high numbers of proposals and
to reserve the central evaluations in Brussels for only the most difficult and complex
cases and for panel meetings in 201617. To accommodate this gradual change from
mainly on site to largely remote evaluations, REA introduced a number of improvements
such as the development of short animated videos for expert briefings and facilities for
teleconferences. Please refer to section 4.1.2 for a detailed analysis of the evaluation
process, as well as Annex 5 for the assessment of the Agency’s key success stories and
lessons learned.

16 REA 2018 Interim Report, p. 37.
17 REA 2016 Annual Activity Report, p. 87.

Another area where a major change occurred during the evaluation period related to the
development of a policy feedback strategy. The Agency’s efforts to undertake necessary
steps in this area are assessed in the subsequent question within this section as well as
in section 4.1.3.

Operational question: does the existing legal framework ensure policy coherence and
unified communication between the Agency and its parent DGs, while at the same time
avoiding "micro-management"?

The existing legal framework aims to ensure policy coherence and unified communication
and collaboration between REA and its parent DGs. As previously mentioned, the
modalities and procedures of interaction between the Agency and its parent DGs were set
out in the Memorandum of Understanding between REA and its six parent DGs in 2016.
The legal framework and Memorandum of Understanding set out flexible provisions to
ensure overall policy coherence and communication between REA and its parent DGs,
while ensuring that no micro-management was present. The provisions have generally
worked well and both the parent DGs and REA appreciated the effectiveness of the
collaboration.

Below we present the analysis of the overall coherence and communication between REA
and its parent DGs. In addition, this section presents some examples of the areas where
collaboration and communication between REA and its parent DGs could be further
improved, including:

‒ Communication between REA and parent DGs on the selection of independent
experts, validation of expert lists and selection of proposals;
‒ Participation of EC officials in project-monitoring activities.

Overall coherence and communication between REA and its parent DGs

REA and its mirror units have relied on the existing legal framework to ensure coherent
policy and unified external communication in various project life-cycle phases during the
evaluation period. According to the interviewed EC and REA officials, the existing
communication mechanisms and other collaboration tools ensured that the collaboration
did not involve any micro-management by the Agency’s parent DGs (as no cases of
micro-management were mentioned). The respondents have also confirmed that REA
closely collaborated with the Commission by providing technical/specialised contributions
in relation to strategic planning for Horizon Europe, drafting of Horizon Europe indicators
and the specific programmes for Horizon Europe. In addition, REA supported programme
design and implementation by closely collaborating with all other members of the R&I
Family. This was also reflected in the results of Survey C as nearly 84 % of the surveyed
EC officials agreed that REA effectively and efficiently implemented the programmes
delegated to it under their portfolio of activities (refer to Figure 5).

In terms of the balance between policymaking and programme implementation tasks
carried out by the Commission and the Agency, around 58 % of the EC officials who
responded to our survey stated that REA enabled them to focus entirely on their
policymaking tasks. Over a third of the respondents felt that they were (to some extent)
also involved in activities delegated to REA which could be regarded as programme
implementation tasks. It is noteworthy, however, that a certain level of the policy
officials’ involvement in the programme implementation tasks, such as participation as an
observer in the evaluation consensus and panel meetings, involvement in the processing
of selection decisions after GAP, is an integral part of the collaboration and policy
feedback mechanism18. This collaboration and policy feedback mechanism, in accordance
with the MoU, relies on close collaboration between REA and policy directorates/units in the
parent DGs, which is necessary to ensure that the results from the delegated actions can
be used as input for the Commission’s policymaking.

Although the provisions of the MoU relating to the collaboration and policy feedback
mechanism have generally worked well, the evaluation identified several business
processes where an improvement was needed. The MoU specified that policy feedback is
embedded in various mechanisms such as the exchanges regarding the results of calls for
proposals and the cooperation on call-related events (e.g. info days), joint briefings of
the expert evaluators and contribution by the Agency, upon request by the parent DGs,
to the draft Work Programme19. Although the MoU set out general provisions for the
collaboration between the Agency and its parent DGs in this area20, it also emphasised
that smooth and effective feedback to policymaking requires both sides to agree and plan
well in advance the clear allocation of responsibilities and tasks between the involved
parties at all stages of the project life-cycle21.

At the time of the evaluation the specific business processes were agreed on in one case
only (REA’s Unit B2). In addition, the overall level of awareness of the specific business
processes involved varied among the parent DGs and units. Some of the parent DGs and
REA units enjoyed a level of flexibility (granted by the MoU) in setting up unique working
practices at the unit level suiting the needs of the specific portfolio of every DG/unit
especially those with a long-standing collaboration dating back to FP7. The lack of formal
agreements, however, had implications for several parent DGs and units where the
collaboration was more recent. Interviews with EC officials from these parent DGs
revealed that there were instances where specific needs of the EC were not fully met as
they would have preferred more customised policy outputs tailored to their needs. As a
result, only 38 % of the surveyed EC officials generally agreed that the Agency provided
them with sufficient policy feedback to inform their policymaking tasks. At the same
time, there were some cases where REA’s proactive supply of information and
communication was not fully taken up by the parent DGs. For more details, refer to
section 4.1.3.

To mitigate the issues that emerged in the absence of the specific business processes
across the Agency and in the context of the varying levels of overall awareness of the
existing process among the parent DGs, REA started to engage proactively with its mirror
units at the Commission to establish their responsibilities and tasks. Please refer to
section 4.1.3 as well as the first in-depth study area in Annex 5 for a detailed analysis of
the actions taken by REA to strengthen its inputs to policymaking. At the time of the
preparation of this report, however, the Agency was in the process of defining and
implementing business processes and policy feedback activities associated with various
areas in which REA collaborated with the parent DGs. Although some of the resulting
actions were already taking place at the time of the interview and survey programmes in
August-October 2018, it should be noted that they may not necessarily have shown their
impact yet at the level of the policy officers. This may explain the lower satisfaction
expressed by EC officials during the survey and interview programmes.

18 Memorandum of Understanding, p. 26-27.
19 Memorandum of Understanding, p. 26.
20 One notable example of the measures to ensure effective collaboration in the area of policy feedback was
the responsibility of the parent DGs to ‘contact the Agency whenever they wish to communicate on project
results in order to ensure availability of updated and reliable information’. MoU, p. 26.
21 Memorandum of Understanding, p. 26-27.
Figure 5. EC officials’ assessment of the policy coherence and unified communication between them and the
Agency. Shares of respondents who strongly or rather agreed / neither agreed nor disagreed / rather or
strongly disagreed:
‒ I have a good working relationship with my REA counterpart(s) at interpersonal level (N=43): 91 % / 9 %
‒ REA effectively and efficiently implements the programmes delegated to it under my portfolio of
activities (N=39): 84 % / 8 % / 8 %
‒ I actively use the inputs provided by REA in my policymaking tasks (N=34): 70 % / 12 % / 18 %
‒ REA is proactive in its daily communication with me (N=42): 59 % / 17 % / 24 %
‒ REA enables me to focus entirely on my policymaking tasks (N=33): 58 % / 30 % / 12 %
‒ I am not involved in any activities delegated to REA which could be regarded as programme
implementation tasks (N=40): 50 % / 15 % / 35 %
‒ REA provides me with sufficient policy feedback to inform my policymaking tasks (N=34): 38 % / 27 % / 35 %
Source: survey of EC officials.

With regards to programme implementation feedback, REA effectively enabled its parent
DGs to exercise their supervisory function and build on experiences from programme
implementation when preparing work programmes. Of the nearly 60 % of the surveyed
Commission officials who stated that they liaised with REA staff during the preparation of
the work programmes and/or research topics in their areas of responsibility during 2015-
2018, over 90 % agreed that the inputs provided by REA were timely
officials thought that REA provided them with relevant inputs for the programme
priorities/research topics under their or their unit’s responsibilities. However, less than
60 % of the surveyed Commission officials noted that the inputs provided by REA were
directly used in the implementation of the preparation of the work programme or
research topics (refer to Figure 6 below).

Figure 6. The extent to which the surveyed EC officials agreed with the following statements regarding the
inputs provided to them or their unit by REA. Shares of respondents from ‘to a large extent’ to ‘not at all’:
‒ The inputs provided by REA were timely (N=23): 61 % / 30 % / 9 %
‒ The inputs provided by REA were of high quality (N=23): 48 % / 44 % / 4 % / 4 %
‒ REA provided me with relevant inputs for the programme priorities/research topics under my/my unit’s
responsibility (N=23): 39 % / 35 % / 9 % / 13 % / 4 %
‒ The inputs provided by REA were directly used in the implementation of the preparation of the work
programme/research topics (N=21): 29 % / 29 % / 29 % / 13 %
Source: survey of EC officials.

At the level of interpersonal contacts, coordination and trust have generally worked well
throughout the 2015-2018 period, ensuring that both staff from REA and EC officials
have access to essential information. Meetings at senior management level with the
parent DGs (organised at least twice a year per programme) and permanent contact at
Head of Unit and operational level allowed the Agency and parent DGs to interact
effectively during all phases of project implementation and exchange policy documents,
relevant project deliverables and other working material. In addition, bilateral meetings
took place on an ad hoc basis to discuss relevant issues. Over 90 % of the surveyed EC
officials felt that they had a good working relationship with their REA counterpart(s) at
interpersonal level. On the other hand, about 60 % of the surveyed EC officials claimed
that REA was proactive in its regular communication with them, while a quarter of the
officials thought that the Agency was not proactive enough in its regular contact with
counterparts in the relevant parent DGs.

Communication between REA and parent DGs on the selection of independent experts,
validation of expert lists and selection of proposals

REA fully complied with the evaluation procedures outlined in its legal framework and the
MoU and even introduced gradual improvements22 into the process of proposal allocation
to the suitable experts. Furthermore, REA engaged with its parent DGs during the
evaluation period to formulate business processes at the unit/parent DG level in this area
of evaluation procedures among others. While the parent DGs generally appreciated the
evaluation procedures and the steps taken by the Agency to improve the processes,
some of the working practices between REA and its mirror units at the Commission were
still affected by the lack of fully established business processes at the time of the
interview and survey programmes. The evaluation also showed that their working
practices were affected by the varying levels of awareness among different policy officials
of the specific existing business processes set out to guide their collaboration with REA.
These issues were observed in relation to the communication between REA and the
parent DGs on the selection of independent experts, validation of expert lists and
selection of proposals.

Although the specific needs of the parent DGs were not yet consistently established, it
was a common practice across REA’s units to invite the policy officials from the parent
DGs to participate in the briefings to experts and inform them about the progress of the
evaluations under their calls or research topics.

22 For instance, when the Proposal Expert Allocation System (PEAS) was integrated into the Proposal submission
and evaluation tool (SEP) to allow an automated pre-allocation of proposals to experts, based on the best
match of their keywords and displaying the level of match as a percentage. This has advanced the procedure
for allocating proposals to experts significantly in terms of speed and quality, particularly in the calls with a high
number of submitted proposals and in the bottom-up programmes (e.g. MSCA calls).

Nevertheless, the interview and survey
programmes demonstrated that some EC officials lacked awareness about the
provisions of the MoU23 which allowed them to participate in the briefings on policy-
relevant aspects to experts or attend panel review meetings as observers. In addition, a
substantial group of Survey C respondents (20 %) felt sufficiently informed24 about the
progress of evaluations in their programmes and/or research topics only to a little extent
or not at all (refer to Figure 7 below). To facilitate mutual direct contacts further, both
the Agency and the Commission should ensure that a list of contact persons including
areas of their responsibility (per project on the Agency side, per thematic area on the DG
side) is always maintained25. Furthermore, according to some EC officials, REA could
make more efforts to provide qualitative information on the evaluation process (e.g. on
the reasons for rejecting certain proposals under the topics with policy interest).

Figure 7. The extent to which the surveyed EC officials were sufficiently informed by REA about the progress of
the evaluations under their call or research topics: 53 % answered ‘Yes, to a large extent’, 23 % ‘Yes, to a
moderate extent’ and 5 % ‘Yes, to some extent’, while the remaining respondents (shares of 15 % and 5 %)
felt informed only to a little extent or not sufficiently informed at all.
Source: Survey of EC officials (the graph is based on 40 valid responses).

Survey C respondents were presented with a question about the extent to which REA
implemented a process which ensured that the proposals best addressing the specific
research topics under their areas of responsibility were selected for funding, as set out in
the H2020 Rules For Participation and further detailed in the Vademecum. Around 55 %
of the respondents (N=21) stated that the Agency always implemented a process which
ensured that the proposals best addressing the specific research topics under their areas
of responsibility were selected for funding. Another 37 % of the respondents (N=14)
thought that the Agency was able to ensure that the best proposals were selected for
funding most of the time or sometimes. About 8 % of EC officials (N=3) disagreed that
REA implemented a process which ensured that the proposals best addressing the
specific research topics under their areas of responsibility were selected for funding (for
more information refer to Figure 8 below). Although all EC officials recognised throughout
the interviews that REA implemented an evaluation process that was fully compliant with
the H2020 Rules for Participation, the H2020 Vademecum and other relevant documents,
they noted that the outcome of this process was not always satisfactory, particularly with
respect to actions with policy interest.

23 Memorandum of Understanding, p. 26.
24 Among the key measures used by REA to inform the parent DGs about the progress of evaluations in their
programmes and/or research topics were the updates about the ongoing processes prior to or during the PC
and NCP meetings and briefings on the outcomes of the evaluations through the call evaluation reports and
various ad hoc analyses.
25 Memorandum of Understanding, p. 27.

REA aimed to ensure that the final pool of expert evaluators and observers corresponded
to the specific needs of each parent DG despite the lack of clearly formulated needs of
the parent DGs. To that end, the Agency staff proactively consulted with the parent DGs
on the pool of experts (unless the DGs chose to opt out) and took their suggestions into
account. The feedback from the interviews and results of Survey C suggest that some
policy officials at the parent DGs still thought that they were not systematically consulted
on the selected pool of experts and that their suggestions were not always taken into
account. This may be attributed to the lack of awareness about the provisions of the MoU
defining the procedures for the development of the final pool of experts, as well as the
measures which might facilitate their collaboration with REA during expert selection.

The MoU specified that the Agency is responsible for assembling a pool of expert
evaluators and observers, and sending it to the relevant parent DGs for consultation
before the approval of the final list, unless the latter request not to be consulted26.
Moreover, REA is also responsible for adopting the final pool of expert evaluators and
observers, as well as informing REA support services even if the relevant parent DGs do
not respond to REA within the five working days with a reasoned recommendation. Thus,
the limited time available for preparing and consulting on the final pool of experts stems
from the MoU and is beyond REA’s control. Nevertheless, a number of measures may
facilitate the collaboration of the
policy officials with REA in the selection of experts despite the limited timeframe. For
instance, EC officials may suggest suitable experts to the Agency well in advance once
the topics in the WP are drafted. REA conducts the consultation on the pool of experts for
most sub-programmes in two steps, including at operational level early on to ensure that
the inputs of the policy officials are taken into account from the outset, and a formal
consultation as well as sign-off at a later stage when finalising the pool. In addition, the
policy officials from the parent DGs may encourage scientists/researchers participating in
international conferences/events to apply to become experts. Interviews with both the
Agency staff and EC officials revealed, however, that the parent DGs did not exploit these
measures to contribute effectively to the expert selection process, owing to a lack of
awareness of both these measures and the procedures outlined in the MoU.

26 Memorandum of Understanding, p. 19.
[Figure 8 data: Yes, always – 29 %; Yes, most of the time – 55 %; Yes, sometimes – 8 %; No, REA overall did not implement a process which ensures that the proposals best addressing the specific research topics under your area of responsibility were selected for funding, as defined in the H2020 Rules For Participation and in the Vademecum – 8 %]

Figure 8. EC officials’ assessment of the extent to which REA implemented a process which ensured that the proposals best
addressing the specific research topics under their areas of responsibility were selected for funding, as set out in the H2020
Rules For Participation and further detailed in the Vademecum.
Source: Survey of EC officials (the graph is based on 38 valid responses).

Participation of EC officials in project-monitoring activities

The Agency compiled and provided the parent DGs with up-to-date information on
projects’ main events, including regular project and review meetings, through the shared
calendars in accordance with the provisions of the MoU27. While project events were less
relevant for the policy officials working with the bottom-up programmes, such as MSCA,
SEWP and FET-Open, they were highly relevant for those who worked with the top-down
programmes. According to the Survey C results, review/mid-term meetings and clustered
review meetings were mentioned as the key mechanisms through which EC officials
learned about the progress made in the relevant portfolio of projects. Due to the varying needs of
the parent DGs concerning the participation of policy officials in project-monitoring
activities, REA took proactive steps to agree on the specific business processes in this
area with each of its parent DGs during the evaluation period. Despite these actions, the
evaluation still observed some issues relating to the feedback on the progress of ongoing
projects and early results/lessons learned for the policy implementation process.

According to the survey of EC officials, around 50 % of the respondents were always or
most of the time invited by REA to attend the project review/mid-term meetings for all or
virtually all projects funded under their programmes or research topics. However, around
17 % of the surveyed EC officials stated that they were generally not invited to attend any of
the project review/mid-term meetings that they were interested in. In addition, almost
35 % of the respondents reported that they were invited to such meetings only
sometimes or occasionally (refer to Figure 9 below). These results demonstrate a need to
further consolidate and streamline the business processes that REA has been tailoring to
the specific needs of the parent DGs with a particular attention to the participation of EC
officials in project-monitoring activities.

27 Memorandum of Understanding, p. 27.
[Figure 9 data: Yes, always (i.e. for all/virtually all projects funded under my programme or research topics) – N=8; Yes, most of the time – N=6; Yes, sometimes – N=10; No, REA generally did not invite me to attend any project review meetings that I was interested in – N=5]

Figure 9. The level of satisfaction with the frequency of invitations to attend the project review meetings for projects funded
under their programme or research topics during 2015-2018 among the surveyed EC officials.
Source: Survey of EC officials (the graph is based on 29 valid responses).

To what extent has REA achieved its objectives with special focus to (a) the
implementation of the delegated programmes Horizon 2020 and FP7, (b) the
implementation of the support services and (c) the implementation of the
internal control principles, notably sound financial and human resource
management? What, if anything, could be done to render REA more effective in
achieving these objectives?

REA’s 2015-2017 AARs and 2018 Interim Report confirmed that the Agency achieved
its objectives as set out in the AWPs between 2015 and the first semester of 2018. Its
activities and priorities were aligned with the broader policy context and objectives of the
parent DGs. In addition, the Agency was responsive to changing policy contexts,
demonstrating an effective response in the areas where policy developments necessitated
significant additional actions. The evaluation concludes, however, that further
improvements could be made in light of the lessons learned during 2015-2018 concerning
the growing prevalence of actions with policy interest.

REA’s mission statement indicates that the Agency assists the Commission in achieving
its objectives in the field of research and the EU strategies to foster growth by ensuring
optimal implementation of the delegated parts of the Horizon 2020 and FP7 Framework
Programmes. During the evaluation period, REA effectively supported its parent DGs in
reaching their objectives in newly mandated activities by undertaking a number of
measures, including:

‒ Implementation of the newly delegated activities and programmes, such as SEDIA,
which extends the participant validation services to all direct management
operations in the Commission and other EU bodies (grants and procurements), and
the implementation of projects generating EU classified information (EUCI) for
Societal Challenge 7 (for more details refer to the beginning of section 4.1.1 on the
assessment of the key changes);
‒ Introduction of remote evaluations, an automated pre-allocation of proposals to
experts and other improvements to the evaluation process (for more details refer to
section 4.1.2 on simplifications);
‒ Optimisation of its processes to cope with the substantial increase in the number of
proposals submitted and the introduction of systematic ex ante controls for the FP7
SME actions (for more details refer to Annex 5 on the assessment of REA’s key
success stories and lessons learned);
‒ Revision of the internal control system and introduction of improvements in the
internal organisation and HR management practices (for more details refer to section 4.1.2
on the assessment of REA’s HR operations).

Please refer to Annex 5 for a detailed overview of the Agency’s key success stories and
lessons learned during 2015-2018.

The lessons learned in the implementation of Horizon 2020

In the absence of formalised modalities and procedures for the implementation of actions
with policy interest in the current MoU, there was an increasing demand for the Agency
to assume the implementation of these new activities in Horizon 2020. Since REA was
implementing very significant parts of several H2020 programmes, these included some
actions with a higher degree of policy relevance to the Commission (e.g. programmes
such as SwafS, where Coordination and Support Actions constituted the largest part of
the funded research activities). At the same time, its parent DGs remained responsible for
a small number of highly policy-relevant actions. Throughout the interview programme, a
number of EC officials recognised an increasing demand for the Agency to assume the
implementation of the remaining actions with policy interest to avoid inefficiencies in
maintaining full project implementation capacity at the EC for a fairly small portfolio of
projects. Both the Agency and Commission officials recognised that this arrangement
would require both sides to formalise a new set of implementation modalities for these
actions to allow the Commission to be closely involved in monitoring such projects (due
to their high significance to the Commission’s policy objectives and reputation). At the
same time, REA’s primary role in the administrative implementation of these new
activities would have to be maintained along with the Agency’s overall responsibility as
an Authorising Officer.

The evaluation also identified that another potential area for improvement related to the
implementation of cross-cutting calls. While the Agency has been successful in
implementing some cross-cutting calls by relying on the provisions of the current MoU 28,
both the Agency and EC officials recognised the lack of flexibility in relation to their
implementation. This is particularly noteworthy for the cross-cutting calls which require
different Executive Agencies to pool together their contributing budgets from different
parts of the framework programmes under their management. For instance, REA
experienced some challenges in relation to the implementation of a cross-cutting call in
the area of Blue Growth under Societal Challenge 2 due to the fact that the budget for
this call was to be pooled together from different parts of the framework programme,
managed by three different EAs, i.e. REA, EASME and INEA. It is expected, however, that
the additional degree of flexibility envisaged under the next EU Framework Programme
for Research and Innovation will contribute to the efficiency gains and economies of scale
achieved by the Agency in the implementation of cross-cutting calls.

Decision C(2013) 9418 (Act of Delegation) sets out the Agency’s tasks with regard to
various funding instruments, including grants, tenders and scientific prizes. The
implementation of the funding instruments, such as tenders or scientific prizes, has not,
however, been delegated to REA during the evaluation period. Several interviewed EC
officials (e.g. in DG GROW) suggested that REA could implement these new types of
actions in the new framework programme, in a similar way to what is currently being
implemented by some other Executive Agencies, e.g. EASME and CHAFEA. The
interviewed REA officials were generally receptive to this idea; however, there has so far
been no demand for REA to implement such actions, and a sufficient scale and number of
them would be needed to fully exploit the efficiency gains and economies of scale that
REA can provide.

Overall satisfaction with the performance of the Agency

Overall, 86 % of respondents to the survey of beneficiaries were satisfied (42 % of them
very satisfied) with the services provided by the Agency (for more detail see Figure 10).

28 Memorandum of Understanding specifies that cross-cutting calls are launched in the focus areas of high
relevance to several of the priorities or specific objectives of H2020 set out in the H2020 Work Programme (pp. 28-29).

This result is higher than the overall level of satisfaction reported by the
beneficiaries of CHAFEA (74 %) and comparable to what was reported by the
beneficiaries of EACEA (89 %) and ERCEA (89 %). The overall satisfaction with the
performance of the Agency was lower among EC officials (79 %). Around 55 % of
unsuccessful applicants were overall satisfied with REA’s performance; however, this
result may relate, at least to some extent, to the negative outcome of their application
process (rejection of the application)29.

[Figure 10 data: Beneficiaries (N=583) – very satisfied 41.5 %, satisfied 44.6 %, neither 9.1 %, dissatisfied 3.3 %, very dissatisfied 1.5 %. EC officials (N=43) – very satisfied 34.9 %, satisfied 44.2 %, neither 14.0 %, dissatisfied 7.0 %. Unsuccessful applicants (N=262) – very satisfied 6.9 %, satisfied 48.1 %, neither 32.4 %, dissatisfied 6.1 %, very dissatisfied 6.5 %]

Figure 10. Overall satisfaction of beneficiaries, unsuccessful applicants, and EC officials with the performance of the Agency.
Source: Survey of REA’s beneficiaries, unsuccessful applicants and EC officials.

Most beneficiaries of the Agency (98 %) further indicated that they would certainly or
possibly consider applying for REA’s calls or tenders again in the future. This sentiment
was indicated by a similar share of REA’s independent experts. Almost all the experts
(99 %) stated that they would certainly or possibly consider working with REA again in
the future. This has improved since the previous evaluation period of 2012-2015, when
95 % of experts thought that they would certainly or possibly consider working with REA
again (based on PPMI’s survey conducted in 2015). Similarly, nearly 90 % of
REA’s unsuccessful applicants also reported that they would be willing to work with the
Agency in the future. These findings suggest an overall very strong willingness of the
respondents to continue working with the Agency in the future. Figure 11 breaks down
the data for the related survey questions.

29 Although applicants do not receive a specific service from REA, as their proposals are evaluated by external
experts, the Agency is responsible for the application, evaluation and selection processes as an authorising
officer.
[Figure 11 data: External experts (N=2342) – yes, certainly 91 %, yes, maybe 8 %, no 1 %. Beneficiaries (N=579) – yes, certainly 83 %, yes, maybe 15 %, no 2 %. Unsuccessful applicants (N=262) – yes, certainly 50 %, yes, maybe 38 %, no, probably not 10 %, no, certainly not 2 %]

Figure 11. Willingness of beneficiaries, unsuccessful applicants and external experts to work with the Agency again in the
future.
Source: Survey of REA’s beneficiaries, survey of REA’s unsuccessful applicants and survey of REA’s external experts.

To what extent has REA contributed to an improved management of the
programme(s) in terms of the elements assessed in the 2013 Cost–benefit
Analysis?

This question relates to the qualitative aspects of the retrospective CBA indicated in the
ToR and Article 3(1) of Regulation (EC) No 58/2003. As described in our proposal and
this report, these aspects were integrated into the overall evaluation framework, i.e. they
are presented in detail in this report in sections 3, 4.1.1, 4.1.2 and 4.2. Such an approach
prevented a duplication of work and ensured an integrated approach throughout the
evaluation exercise. Nevertheless, we also summarised the key findings concerning each
qualitative and quantitative aspect of the CBA identified in Article 3 of Regulation
58/2003 in section 4.2.5 ‘Qualitative aspects of the CBA’ of this report.

To what extent does REA’s communication function support the mission of the
Agency and does REA ensure an effective feedback loop with the policymaking
DGs?

This question relates to the coherence aspects of the evaluation indicated in the ToR.
Thus, it has been integrated into the overall evaluation framework, i.e. presented in
detail in section 4.1.3. Such an approach prevented a duplication of work and ensured an
integrated approach throughout the evaluation exercise.

To what extent has REA contributed to improved management of the
programmes in terms of:
 proximity to addressees?
 visibility of EU as promoter of the programmes entrusted to the Agency?
To what extent and how has REA contributed to an improved management and
visibility of the delegated programmes and better services and satisfaction to
the stakeholders and addressees in terms of the elements assessed in the 2013
CBA and as compared to the alternative options mentioned in that CBA?

REA continued to serve as a direct ‘contact point’ between the applicants/beneficiaries of
EU funding, as well as other stakeholders, and the Commission during the evaluation
period. Based on the Memorandum of Understanding and the results of the beneficiary
satisfaction survey conducted in 2015, REA’s external communication activities focused
on three priorities30: boosting awareness of new funding opportunities and broadening
the group of participants, consolidating a service-oriented communication, and
supporting the parent DGs in giving visibility to EU research via success stories and
providing input to the policy-feedback loop.

30 REA 2018 Interim Report, p. 70.

As regards the Agency’s communication with applicants and beneficiaries, REA
organised and participated in numerous information days and coordinators’ days, which
effectively promoted the programmes implemented by REA and ensured their smooth
implementation. The Agency also continuously promoted its funding opportunities and
opportunities for experts (e.g. by creating a campaign to attract new experts to
Horizon 2020 with a video that showcased REA’s evaluation facilities and a series of
animated briefings for experts). REA’s communication officers were in regular contact
with REA’s operational units via the Communication correspondents, who provided
examples of successful and promising projects to be used by the parent DGs in
their external communication activities (e.g. through the articles on projects published on
Europa or Horizon Magazine, participation in international high-impact events, or
briefings for the missions of the Commissioners, Directors-General and other EC
representatives).

Figure 12 shows that REA’s beneficiaries were largely satisfied with the way the Agency
communicated with them during various phases of the project life-cycle. As many as
81 % of the respondents to the beneficiaries’ survey strongly or rather agreed that REA
staff assigned to their project were easily available and responsive during the
implementation. The most favourable results were received in relation to REA’s availability
and responsiveness during the grant amendment (82 % strongly or rather agreed) and in
the grant finalisation phases (85 % strongly or rather agreed). About 74 % of the
respondents claimed that the feedback received on the scientific and technological
progress in their project was useful.

Compared to the previous evaluation period of 2012-2015, some changes have been
observed with respect to the clarity of information from REA concerning the
administrative requirements. As many as 81 % of the respondents strongly or rather
agreed that the information from REA concerning administrative requirements was clear,
compared to 77 % in the previous survey conducted by PPMI in 2015. However, the
satisfaction of beneficiaries with the communication and responsiveness during the
application phase has slightly decreased, as only 66 % reported that they knew who to
contact for any question(s) they had or where to get help when submitting their
application compared to 71 % reported in the previous survey. Even fewer of the
beneficiaries (61 %) agreed that they knew who to contact for any question(s) they had
or where to get help when preparing their application (compared to 69 % in the previous
survey). This can be partly explained by a reinforced EC policy to rule out any contacts
with EC call managing services or pre-proposal checks, referring all needs for assistance
to the NCPs or the Research Enquiry Service.

[Figure 12 data (strongly agree / rather agree / neither agree nor disagree / rather disagree / strongly disagree):
– The REA staff assigned to my grant amendment procedure were easily available and responsive (N=213): 59 % / 23 % / 8 % / 6 % / 4 %
– The REA staff assigned to my project in the grant finalisation and negotiation phase were easily available and responsive (N=583): 52 % / 33 % / 8 % / 5 % / 2 %
– The REA staff assigned to my project during the implementation were easily available and responsive (N=269): 50 % / 31 % / 10 % / 6 % / 3 %
– The information and advice provided by REA during the amendment process was clear (N=213): 49 % / 27 % / 11 % / 8 % / 5 %
– The feedback I received on the progress with the content in my project (e.g. in a mid-term review) was useful (N=241): 48 % / 26 % / 17 % / 5 % / 4 %
– Requests from REA (e.g. for proposal modification or providing missing information) were clear (N=564): 40 % / 40 % / 11 % / 6 % / 3 %
– The information from REA concerning administrative requirements was clear (N=274): 33 % / 48 % / 9 % / 8 % / 2 %
– The instructions provided by REA at the beginning of the granting process were clear (N=585): 30 % / 50 % / 12 % / 6 % / 2 %
– I knew who to contact for any question(s) I had or where to get help when submitting my application (N=590): 27 % / 39 % / 21 % / 11 % / 2 %
– I knew who to contact for any question(s) I had or where to get help when preparing my application (N=590): 25 % / 36 % / 22 % / 13 % / 4 %]

Figure 12. Satisfaction of beneficiaries with REA’s external communication/responsiveness.
Source: survey of REA’s beneficiaries.

As regards the effectiveness of communication channels used by REA to provide
beneficiaries with relevant information, email was the preferred tool of the respondents
to the beneficiaries’ survey. Altogether 92 % of them said that this channel was useful
to a large or at least to a moderate extent. About 57 % of the surveyed REA
beneficiaries thought that telephone contact was another key communication channel
that, to a large or at least moderate extent, provided them with relevant, helpful
information when they needed it. As demonstrated in Figure 13, REA’s website31 and
face-to-face contacts were also seen as quite useful. On the other hand, a large majority of
beneficiaries doubted the relevance of live web and recorded video briefings as well as
REA’s Facebook profile. Some respondents noted that they also relied on the
communication portal within the Participant Portal or their local grants office to obtain
relevant information32.

31 However, the primary source of information for applicants/beneficiaries was the Participant Portal.
32 Based on the feedback received from REA, all official communication related to projects runs through the
Participant Portal in accordance with the provisions of the grant agreement.
[Figure 13 data (to a large or moderate extent / to some extent / to a little or no extent at all): E-mail contact (N=568): 92 % / 4 % / 4 %; Telephone contact (N=457): 57 % / 14 % / 29 %; REA’s website (N=440): 44 % / 19 % / 37 %; Face-to-face contacts (meetings, events) (N=433): 44 % / 19 % / 37 %; Live web briefings (with a chat function) (N=296): 12 % / 12 % / 76 %; Recorded video briefings (N=304): 11 % / 11 % / 78 %; REA’s Facebook profile (N=301): 3 % / 7 % / 90 %]

Figure 13. The extent to which the following communication channels used by REA provided the Agency’s beneficiaries with
relevant, helpful information when they needed it.
Source: survey of REA’s beneficiaries.

Communication with external experts

REA effectively informed its network about the opportunity to work as an external
expert. As many as 80 % of the surveyed independent experts believed that the
information on how to become an external expert was easily available; 12 % of the
respondents were neutral on this issue.
their institutions was mentioned by 26 % of the surveyed external experts as the key
channel to learn about the opportunity to become an expert. Other key channels
included the European Commission website (19 %), participation in another research
project supported by FP7/H2020 (18 %) and relevant national sources such as research
ministry, EU liaison office at their university, etc. (15 %).

External experts were very satisfied with the information provided to them online by REA.
The H2020 online manual for experts was seen as especially useful by 94 % of external
experts responding to the survey. The FAQ and other reference documents were also
considered very or fairly useful by 93 % and 89 % of the respondents, respectively (see
Figure 14).

[Figure 14 data (very or fairly useful / neither useful nor not useful / not very useful or not useful at all): H2020 online manual (N=1350): 94 % / 5 % / 1 %; FAQ for experts (N=1354): 93 % / 5 % / 2 %; Reference documents (N=1248): 89 % / 9 % / 2 %; IT Helpdesk (N=837): 77 % / 16 % / 7 %; Research Enquiry Service (RES) (N=792): 75 % / 19 % / 6 %]

Figure 14. Independent experts’ assessment of the usefulness of information provided online.
Source: survey of REA’s external experts.

External experts who worked with REA during the evaluation period were largely satisfied
with the Agency’s communication during every stage of their assignment. It is notable
that over 95 % of external experts strongly or rather agreed that REA staff with whom
they worked were responsive and provided them with answers to their questions (refer to
Figure 15 below). Around 90 % of the external experts were also positive about the
clarity of tasks and procedures, who to contact with questions, etc.

[Figure 15 data (strongly or rather agree / neither agree nor disagree / rather or strongly disagree):
– The REA staff with whom I worked provided useful answers to my questions (N=2216): 96 % / 3 % / 1 %
– The REA staff with whom I worked were responsive (e.g. by email or phone) (N=2270): 95 % / 4 % / 1 %
– Tasks I had to carry out were clearly stated in the contract (N=2396): 92 % / 5 % / 3 %
– I was appropriately briefed on the requirements for my work (esp. through the telephone conference briefings) (N=2230): 91 % / 5 % / 4 %
– It was clear to me how to evaluate and rate proposals/monitor project activities (N=2438): 89 % / 8 % / 3 %
– Information provided by REA was clear and sufficient (e.g. guides for evaluators, guidance on how to use the electronic evaluation system, briefing documents) (N=2583): 86 % / 11 % / 3 %
– I knew either who to contact or where to get help regarding any questions I had when working on my tasks (N=2618): 82 % / 15 % / 3 %]

Figure 15. Satisfaction of independent experts with REA’s external responsiveness and competence.
Source: survey of REA’s external experts.

Concerning visibility of the EU as a promoter of the programmes entrusted to REA, we
looked at the level of understanding among stakeholders that (1) the EU/European
Commission is the promoter of all programmes delegated to REA and (2) the Agency is
acting under powers delegated by the Commission. Figure 16 shows that a large majority
of beneficiaries related the grants and tenders implemented by REA both to the EU
budget (95 %) and to the European Commission as an institution (84 %). Over 80 % of
the respondents strongly or rather agreed with the view that the programmes
implemented by REA were well advertised.

[Figure 16 data (strongly or rather agree / neither agree nor disagree / rather or strongly disagree): When applying for this grant, I was aware that the REA grants were funded from the EU budget (N=230): 95 % / 3 % / 2 %; When applying for this grant, I was aware that REA was entrusted to manage its grants by the European Commission (N=228): 84 % / 12 % / 4 %; Overall, funding opportunities for the REA-managed programmes are well advertised (N=227): 81 % / 13 % / 6 %]

Figure 16. Beneficiaries’ views regarding visibility of EU as promoter of the programmes entrusted to REA.
Source: survey of REA’s beneficiaries.

According to our survey of beneficiaries, there were four key channels used to learn
about the EU research grants: recommendations by colleagues or superiors (21 %), the
Research Participant Portal (17 %), European Commission websites such as the
FP7/H2020 portal, REA website and CORDIS (17 %), and the National Contact Points (13 %).

To what extent has REA contributed to improved management of the
programmes in terms of effective implementation of the programmes, taking
into account the interests of the addressees and those of the EU:
i. rate of execution of commitment appropriations;
ii. rate of execution of payment appropriations;
iii. time-to-grant;
iv. net time-to-pay;
v. residual multi-annual error rate identified at ex post control.

Overall, REA was efficient in managing the delegated programmes and achieved very
good results in terms of most Key Performance Indicators (KPIs). Compared to the
previous years, the Agency’s performance improved during the 2015-2018 period.

REA further improved its performance in terms of timely conclusion of grant agreements
(measured in ‘Time-To-Grant’ – TTG) both over the evaluation period and, especially,
when compared with the previous years. The average REA TTG decreased from 351 days
in 2010 to 193 days in 2016-2018, 52 days (over 20 %) below the H2020 target. The
share of grants concluded within TTG targets reached 99 % for 2015 calls and nearly
100 % for 2016-2017 calls.

Regarding payments to grants, the average Time-to-Pay (TTP) stood well below the
contractual thresholds for all types of payments (pre-financing, interim and final
payments) in 2015-2018 both for FP7 and H2020. Concerning the share of payments
within contractual limits, nearly 100 % of all pre-financing payments in 2015-2018 were
executed on time. With regard to interim and final payments, REA maintained a similar
performance level compared to the previous evaluation period for FP7 (94 % of
payments executed on time) and during 2017-2018 improved its performance for H2020
(98 % of payments executed on time).

Similar to the previous years, REA managed to achieve full execution of its operational
budget both in commitment and payment appropriations during 2015-2018.

With regard to the legality and regularity of transactions, the estimated residual multi-
annual error rate over 2014-2018 remained in a range similar to that of the previous
years for all FP7 programmes implemented by REA. The residual multi-annual error rate
for the SME actions, Space and Security was above the materiality threshold of 2 %,
while that for the People Programme remained below 2 %. In addition, the estimated
error rate could be attributed, to a large extent, to the complexity of the funding
schemes, which is generic to FP7 as a whole; therefore, these error rates remained in
line with the general FP7 trend.

A detailed analysis of REA’s performance with respect to the execution of the
programmes’ implementation tasks and relevant KPIs is presented in the next section
of the report, covering efficiency.

4.1.2 Efficiency

When evaluating operational efficiency, we analysed REA’s processes, services and
products, as well as the allocation and use of available financial resources. This task also
covered aspects related to key (financial and non-financial) performance results of the
Agency’s operations, human resources and the organisational structure of REA.

Operational question: To what extent is REA operating efficiently with respect
to timely execution of programme management functions and other KPIs?
Which aspects/means/actors or processes render REA more or less efficient?
What could be improved? What is the quality of the services/advice provided by
REA to stakeholders and addressees?

Our evaluation of the overall efficiency in REA’s performance was primarily based on the
analysis and interpretation of the key performance indicators related to:

 Timely execution of the delegated functions;

 Cost efficiency of the management and control arrangements;

 Effectiveness of the established supervisory and control systems in ensuring the
legality and regularity of the programmes’ expenditure;

 Budget execution of commitment and payment appropriations.

This evaluation question also addressed the efficiency of REA during specific stages of the
project life-cycle – the efficiency and transparency of the evaluation and selection
process, the efficiency of conclusion of grant agreements, as well as the efficiency
and effectiveness of the follow-up, monitoring and control of grant
implementation procedures.

The previous evaluation study (2012-2015) of REA33 found that REA was efficient in
managing the delegated programmes and achieved good results in terms of most KPIs.
Compared to the previous years, the Agency’s performance improved during the 2012-
2015 period.

Evaluation and selection of proposals, conclusion of grant agreements

The overall objective of the proposal evaluation stage is to ensure scientific excellence
(selection of the best projects) and timely communication of the selection results to the
applicants. Panels of external reviewers, who are experts in the scientific field, review
proposals. The selected proposals then enter the contracting phase, which is completed
with the signature of grant agreements. At this stage, REA’s objectives are to ensure that
grant agreements comply with legality and regularity requirements and are concluded
within the set time limit.

Generally, REA improved its performance in terms of timely conclusion of grant agreements (measured as ‘Time-To-Grant’ – TTG), both over the evaluation period and, especially, when compared with previous years (refer to Figure 17). The average REA TTG decreased from 222 days in 2014 to 203 days in 2015 and stabilised at 193 days during 2016-2018, 52 days (over 20 %) below the H2020 target.

33 PPMI (2016). Evaluation of the Operation of REA (2012-2015). Final report.

Year               2010  2011  2012  2013  2014  2015  2016  2017  2018 (1st sem.)
Average TTG (days)  351   275   255   244   222   203   193   193   193
H2020 TTG target: 245 days

Figure 17. REA’s performance in terms of the average TTG, 2010-2018.
Source: compiled by PPMI based on the AARs of REA.

According to Article 20 of the H2020 Rules for Participation, REA has 8 months between the call deadline and the signature of grants. This consists of two periods:

 Time-To-Inform (TTI): for informing applicants of the outcome of the scientific evaluation of their application, a maximum period of 5 months (153 days) from the final date for submission of complete proposals is set;
 Time-To-Grant (TTG): for signing grant agreements with applicants or notifying grant decisions to them, a maximum period of 8 months (245 days) from the call deadline is set.

REA was very successful in reaching TTI targets in 2015-2018: 96 % of the participants
were informed within 153 days in 2015 and 100 % in 2016-2018. The average TTI
decreased from 142 days in 2015 to 132 days in 2016 and further to 130 days in 2017.

With regard to the overall TTG target, 97 % of grants of the 2014 calls were concluded within 245 days. For the 2014 SwafS calls, there were delays in both TTI and TTG for all grants, resulting from the complexity of the transfer of these actions from the parent DG RTD to REA, the need to reconvene the panel for one of the calls and the processing of files for consultation of the Programme Committee34. Nevertheless, learning from this experience, REA and DG RTD worked together to streamline their cooperation and ensured that the TTG target was met for all grants of the 2015-2017 SwafS calls. For the SC7 Security calls, the TTG delays were primarily related to the extensions granted to applicants, at their request, in order to finalise their obligations, to complex ethics reviews, etc.

The share of grants concluded within the TTG target rose to 99 % for the 2015 calls and to nearly 100 % for the 2016-2017 calls; the grants that exceeded the TTG limits mostly involved extensions granted to beneficiaries, at their request, or specific circumstances (such as security scrutiny or a very complex ethics review).

34 REA 2015 Annual Activity Report.
             MSCA   FET Open  Space  SC 2   SC 6   SC 7   SEWP   SwafS  All
Calls 2014    97 %   100 %     97 %   94 %   91 %   88 %   91 %    0 %   97 %
Calls 2015   100 %    96 %     97 %  100 %   97 %   92 %  100 %  100 %   99 %
Calls 2016   100 %    95 %     97 %   98 %   91 %   95 %  100 %  100 %  100 %
Calls 2017   100 %   100 %    100 %   98 %  100 %   91 %  100 %  100 %  100 %

Figure 18. REA’s performance in terms of TTG (the share of grants concluded within the TTG target) per activity and year of the
call, 2014-2017.
Source: compiled by PPMI based on the AARs of REA.

The results of the beneficiaries’ survey showed (refer to Figure 19) that 83 % of REA’s beneficiaries were satisfied with the timeliness of the evaluation and selection of proposals35, a similar 84 % were satisfied with the timeliness of contracting36 and 80 % with the overall time period between the call deadline and the signature of the grant agreement37. This level of beneficiary satisfaction is significantly higher than in the previous beneficiary survey (where satisfaction with TTI stood at 77 %, with TTC at 76 % and with TTG at 72 %)38, which clearly reflects REA’s improved performance on TTG-related indicators. The survey of unsuccessful applicants showed that 71 % of them were satisfied with the timeliness of the evaluation and selection of proposals, while another 12 % were neutral.

[Stacked bar chart showing agreement that: the time period from the call deadline to the announcement of the proposal outcome (time-to-inform) was appropriate (N=606); the time period from the announcement of the outcome to contract signature (time-to-contract) was appropriate (N=604); the overall time period from proposal submission to signature of the grant agreement (time-to-grant) was appropriate (N=605). Response scale: Strongly agree – Strongly disagree.]

Figure 19. Satisfaction of beneficiaries with the performance of REA in relation to timeliness of the evaluation, selection and
contracting process.
Source: survey of REA’s beneficiaries.

In order to maintain efficiency and transparency of the programming and call for
proposals process, it is important to ensure that information for applicants is easy to
find and clear. The results of the beneficiaries’ survey showed that as many as 88 % of
respondents agreed that information for applicants was easy to find and around 85 %

35 TTI.
36 TTC (Time-To-Contract).
37 TTG.
38 The survey of REA’s beneficiaries carried out under the Evaluation of the Operation of REA (2012-2015).
claimed that it was clear. These results are similar to those of the 2015 REA beneficiaries’ survey (where 85 % of respondents agreed that information for applicants was easy to find and 84 % claimed that it was clear). The survey of unsuccessful applicants showed that, although their level of satisfaction was lower than in the beneficiaries’ survey, it remained rather high – 75 % of respondents agreed (another 13 % were neutral) that information for applicants was easy to find and 70 % (another 12 % were neutral) that it was clear.

[Stacked bar chart showing agreement that: information for applicants was easy to find (N=613); information for applicants was clear (N=614); the requirements for the application process were reasonable and proportionate (N=610); the proposal templates were well structured and easy to follow (N=610); the electronic tool used for submitting the application was user-friendly (N=605). Response scale: Strongly agree – Strongly disagree.]

Figure 20. Satisfaction of beneficiaries with the application process.
Source: survey of REA’s beneficiaries.

Around 74 % of respondents to the beneficiaries’ survey agreed that the requirements for the application process were reasonable and proportionate, an improvement compared to the previous survey (69 %). In addition, a slightly higher share of beneficiaries (77 %) agreed that the proposal templates were well structured and easy to follow. Satisfaction among unsuccessful applicants with these aspects was lower than among the beneficiaries: 59 % of respondents agreed (another 18 % were neutral) that the requirements for the application process were reasonable and proportionate, and 64 % agreed (another 20 % were neutral) that the proposal templates were well structured and easy to follow.

Satisfaction with the user-friendliness of the electronic systems used for submitting
the applications reached 71 % among beneficiaries and 68 % among unsuccessful
applicants (another 18 % being neutral). The level of satisfaction has not changed
compared to the previous survey of H2020 beneficiaries (71 %). It is important to note
that the proposal templates and submission systems are not under the control of REA as
they are centrally established and provided by the EC services for the whole Framework
Programme.

[Stacked bar chart showing agreement that: the results of my application via the Participant Portal were easy to access (N=598); the individual reviews and panel comments provided were clear (N=601); the individual reviews and panel comments were useful in understanding the strengths and weaknesses of my proposal (N=601); overall, the evaluation process was transparent (N=596). Response scale: Strongly agree – Strongly disagree.]

Figure 21. Satisfaction of beneficiaries with the evaluation process.
Source: survey of REA’s beneficiaries.

Overall satisfaction with the transparency of the evaluation process among beneficiaries reached 81 %, substantially higher than in the previous survey (72 %). A total of 87 % of respondents to the beneficiaries’ survey agreed that the outcome of the evaluation of their application was easy to access via the Participant Portal. Concerning the quality of the feedback provided on the evaluation results, 80 % of respondents agreed that the individual reviews and panel comments provided were clear and 77 % that these comments were useful in understanding the strengths and weaknesses of the proposal.

The results of REA’s unsuccessful applicants’ survey revealed that they were substantially less satisfied with the quality of the evaluation process and the feedback received than the beneficiaries, which could relate, at least to some extent, to the negative outcome of their application process (rejection of the application). Based on the results of the unsuccessful applicants’ survey, 40 % were satisfied (another 27 % being neutral) with the overall transparency of the evaluation process, 43 % of respondents agreed that the individual reviews and panel comments provided were clear (another 21 % being neutral) and 32 % (24 % being neutral) that these comments were useful in understanding the strengths and weaknesses of the proposal. Under the previous REA beneficiaries’ survey, 54 %39 of beneficiaries whose applications had not been successful in previous submissions stated that the explanation provided for the application not being selected was clear. For a detailed overview of the suggestions made by the unsuccessful applicants on how the evaluation process could be improved, please refer to the summary of the unsuccessful applicants’ survey results in Annex 2.

The survey of EC officials revealed that 55 % of respondents were of the opinion that the
evaluation process implemented by REA always ensured that the proposals best
addressing the specific research topics were selected for funding, although the level of
satisfaction varied40 (refer to Figure 8 in section 4.1.1). The survey showed that
respondents to the EC officials’ survey were very satisfied with the timeliness of REA’s
evaluation process (refer to Figure 22). A slightly lower level of satisfaction related to the
transparency and quality of the evaluation process.

39 Share of respondents who strongly or rather agree that the explanation provided for the application not being selected was clear. This survey question was addressed only to beneficiaries whose applications were not successful in the previous submissions.
40 55 % of the respondents stated that the Agency always implemented a process which ensured that the proposals best addressing the specific research topics under their areas of responsibility were selected for funding. Another 37 % of the respondents thought that the Agency was able to ensure that the best proposals were selected for funding most of the time (29 %) or sometimes (8 %).
[Bar chart showing the extent to which EC officials agreed that: timeliness of evaluations was appropriate (N=38); the evaluation process was transparent (N=34); quality of evaluations was appropriate (N=34). Response scale: To a large extent – Not at all.]

Figure 22. Satisfaction of EC officials with the evaluation process organised by REA.
Source: survey of EC officials.

The H2020 evaluation review procedure41 was set up to give applicants the possibility to file a complaint if they think that there were shortcomings in the handling of their proposal during the evaluation. A committee analyses all complaints and, where appropriate, may recommend the re-evaluation of the proposal. The indicator on the evaluation review procedure helps to monitor the quality and effectiveness of the proposal evaluation process. REA’s efforts to improve the quality of the evaluation process and the Evaluation Summary Reports (ESRs) were reflected in a reduced share of evaluation review/redress cases filed and (fully or partially) upheld relative to the number of proposals evaluated (refer to Figure 23). The share of evaluation review/redress cases filed decreased from 3.0 % for the 2011 calls to 1.2 % for the 2017 calls. Similarly, the share of evaluation review/redress cases upheld decreased from 0.8 % for the 2011 calls to 0.3 % for the 2017 calls, and for the 2014-2017 calls it stayed below the maximum target of 0.5 %. Two proposals were funded after re-evaluation for the 2015 calls; in addition, one proposal was re-evaluated positively in the first stage (but failed in the second stage) for the 2017 calls. These numbers are very low compared to the total population of proposals received and selected during the evaluation period.

Share of evaluation review cases relative to the number of proposals evaluated:

                WP 2011  WP 2012  WP 2013  WP 2014  WP 2015  WP 2016  WP 2017
                (FP7)    (FP7)    (FP7)    (H2020)  (H2020)  (H2020)  (H2020)
Cases filed     3.00 %   2.50 %   2.60 %   1.59 %   1.57 %   1.39 %   1.21 %
Cases upheld    0.80 %   0.60 %   0.34 %   0.27 %   0.46 %   0.39 %   0.31 %
Maximum target for upheld cases: 0.50 %

Figure 23. Share of evaluation review/redress cases (%).
Note: the share of evaluation review/redress cases filed/upheld is calculated in relation to the number of eligible proposals.
Source: compiled by PPMI based on the AARs of REA.

The interview programme and REA’s AARs indicated that the Agency paid constant attention to improving the quality of the evaluation process. Since 2015, REA has used an IT tool (the PEAS allocation module in SEP) that allows an automated pre-allocation of proposals to experts based on the best match between keywords related to the scientific field of the proposal and the scientific profile of the expert evaluators. The pre-allocation can be checked and updated manually. In addition, REA took specific measures to automate and improve the detection of possible conflicts of interest for experts. More information on these issues is provided in the analysis of simplifications within this section.

41 Formerly called ‘redress procedure’ under FP7.

The REA beneficiaries’ survey revealed that 87 % of the respondents were satisfied with the overall grant conclusion process (refer to Figure 24). Satisfaction with the clarity of the instructions provided by REA at the beginning of the granting process reached 80 %. The same 80 % of respondents agreed that the requests from REA (e.g. for proposal modification or for providing missing information) were clear, very similar to the results of the previous survey of REA beneficiaries (79 %). The lowest level of satisfaction related to the user-friendliness of the IT tools used in the grant conclusion stage (66 %); this has not changed compared to the previous REA beneficiary survey (65 %) and relates to corporate tools that are not under the control of REA.

[Stacked bar chart showing agreement that: the instructions provided by REA at the beginning of the granting process were clear (N=585); requests from REA (e.g. for proposal modification or providing missing information) were clear (N=564); the electronic tools used in the negotiation/contracting process were user-friendly (N=574); overall, the granting process was transparent (N=586). Response scale: Strongly agree – Strongly disagree.]

Figure 24. Satisfaction of beneficiaries with the contracting process.
Source: the survey of REA’s beneficiaries.

Follow-up, monitoring and control of grant implementation, payments

During the grant implementation stage, REA aims to ensure that the projects remain on
track in terms of their performance, budget execution, legality and regularity of the
transactions.

As the interim evaluation covers the period from mid-2015 to mid-2018, REA’s grant implementation activities (interim payments, monitoring and follow-up activities, etc.) evolved with respect to the programmes’ coverage. Of the interim and final payments in 2015-2016, 96 % related to FP7; however, in 2017 the share of H2020 interim and final payments rose to 41 % of all payments, and in 2018 the number of H2020 interim and final payments surpassed the number of FP7 payments.

[Chart: average net Time-to-Pay (TTP) in days, 2013-2018, for pre-financing payments (net TTP target: 30 days) and for FP7 and H2020 interim and final payments (net TTP target: 90 days); the underlying figures are given in Table 6.]
Figure 25. REA’s performance in terms of the average TTP, 2013-201842.
Source: compiled by PPMI based on the AARs of REA.

Regarding payments to grants, the average Time-to-Pay (TTP) stood well below the contractual thresholds for all types of payments (pre-financing, interim and final payments) in 2015-2018, both for FP7 and H2020 (refer to Figure 25).

Concerning the total number of payments made within the contractual limits in 2015-2018, nearly 100 % of all pre-financing payments were executed on time. With regard to interim and final payments, REA maintained a performance level for FP7 similar to the previous evaluation period (94 % of payments executed on time; the average TTP grew slightly during 2015-2018, which could relate to an increasing share of final payments, which are more time-consuming to verify). During 2017-2018, REA improved its performance for H2020 (98 % of payments executed on time) (Table 6).

Table 6. REA’s performance in terms of TTP, 2012-2018.

Pre-financing payments (net TTP target = 30 days)
Year                 Payments made   Average TTP (net)   Average TTP (gross)   Share of grant payments made on time43
2012                 2,035           No data             No data               98 %
2013                 2,266           12.7                49.4                  97 %
2014                 1,786           11.9                48.5                  99 % for FP7 and 100 % for H2020
2015                 1,793           9.6                 24.5                  97 % for FP7 and 99 % for H2020
2016 FP7             12              23.1                139.5                 92 %
2016 H2020           1,768           11.3                11.3                  99 %
2017 FP7             4               18.3                32.5                  100 %
2017 H2020           1,700           9.2                 9.2                   100 %
2018 1st sem. H2020  844             11                  –                     100 %

Interim and final payments (net TTP target = 90 days)
2012                 2,330           No data             No data               90 %
2013                 2,392           54.3                81.7                  94 %
2014                 2,963           57.0                84.9                  96 %
2015                 2,90344         61.4                95.4                  94 % for FP7 and 92 % for H2020
2016 FP7             2,819           65.1                107.5                 94 %
2016 H2020           202             62.7                86.1                  90 %
2017 FP7             1,561           69.1                123.6                 94 %
2017 H2020           1,082           56.1                75.8                  98 %
2018 FP7             490             72.6                137.7                 94 %
2018 H2020           838             65.2                86.2                  98 %

42 1st semester of 2018.
43 Target = 30 days for pre-financing payments and 90 days for interim and final payments.
44 Of which 52 for H2020.

Source: REA’s AARs 2012-2017, REA’s Interim Report 2018.

The actual time elapsed between the submission of the payment claim and the transfer of funds by the Agency is measured by the gross TTP, which shows the payment processing time from the beneficiary’s perspective. Gross TTP can be significantly longer, as delays by beneficiaries in providing additional documents or clarifications in response to requests from REA are discounted from the net TTP (the ‘stop-the-clock’ mechanism is applied).

The difference between net and gross TTP for interim and final payments is mostly related to the complexity of the deliverables (including cost claims) submitted. As seen in Table 6, the average gross TTP for interim and final payments during 2015-2018 exceeded the average net TTP by 43 days (66 % of net TTP) for FP7 and by 21 days (34 % of net TTP) for H2020; REA’s better performance in relation to H2020 gross TTP was facilitated by the fully electronic H2020 grant management IT tools.

REA’s very good performance in terms of timely processing of payments was also reflected in the results of the beneficiaries’ survey – 94 % of the beneficiaries were satisfied with the time it took REA to make pre-financing payments and 89 % with the time it took REA to process interim and final payments. These results show a significant improvement compared to the previous REA beneficiaries’ survey, where 78 % of beneficiaries were satisfied with the timeliness of interim payments and only 64 % with the timeliness of final payments.
[Stacked bar chart showing agreement that: for pre-payment, the time it took the Agency to make the payment was appropriate (N=456); for interim payments, the time it took the Agency to process payment requests and make payments was appropriate (N=228); for the final payment, the time it took the Agency to process the payment request and make the payment was appropriate (N=71). Response scale: Strongly agree – Strongly disagree.]

Figure 26. Satisfaction of beneficiaries with the performance of REA in relation to timeliness of the payment process.
Source: survey of REA’s beneficiaries.

During 2015-2018, REA significantly improved its performance in terms of timely processing of requests for grant amendments submitted by beneficiaries. REA was well below the Time-To-Amend (TTA) contractual target of 45 days measured against net TTA in 2015-2018 (refer to Figure 27). TTA performance for H2020 grants was significantly better than for FP7, especially in terms of gross TTA, which is more important from the beneficiaries’ perspective.

[Chart: average net and gross TTA in days for FP7 grants (2014-2018) and H2020 grants (2015-2018), against the net TTA contractual target of 45 days.]

Figure 27. REA’s performance in terms of the average TTA, 2014-2018.
Source: compiled by PPMI based on the AARs of REA.

Although REA complied well with the contractual TTA targets, a moderate 68 % of beneficiaries were satisfied with the time it took REA to process grant amendment requests (refer to Figure 28). This level of satisfaction nevertheless improved slightly compared to the previous survey of REA beneficiaries (64 %). A similar 67 % of beneficiaries were satisfied with the smoothness of the overall grant amendment process; a higher level of satisfaction (76 %) related to the clarity of the information and advice provided by REA during the amendment process. A number of respondents to the beneficiaries’ survey claimed that the grant amendment procedure was too burdensome and time-consuming, especially for small grant amendments.

                                                         Strongly  Rather  Neither agree  Rather    Strongly
                                                         agree     agree   nor disagree   disagree  disagree
The information and advice provided by REA during
the amendment process was clear (N=213)                   49 %      27 %    10 %           9 %       5 %
The time it took REA to process grant amendment
requests was appropriate (N=208)                          38 %      30 %    13 %           11 %      8 %
Overall, the grant amendment process was smooth (N=209)   38 %      29 %    11 %           11 %      11 %

Figure 28. Satisfaction of beneficiaries with the performance of REA in relation to the grant amendment process.
Source: the survey of REA’s beneficiaries.

The results of the REA beneficiaries’ survey revealed that beneficiaries were generally satisfied with most aspects related to the reporting and monitoring of grants. Nearly 80 % of the respondents to the beneficiaries’ survey agreed that the technical and financial reporting requirements were clear, and a similar share agreed that the process of project monitoring by REA was smooth (76 %) and transparent (78 %). In addition, 75 % of the respondents agreed that the feedback received from REA on the progress of projects was useful (see Figure 29). These results showed improvement compared to the 2015 REA beneficiaries’ survey, where 76 % of the respondents agreed that project reporting requirements were clear, 70 % that the process of project monitoring by REA was clear and transparent and 75 % that the feedback received from REA on the progress of projects was useful. Over 80 % of the respondents to the beneficiaries’ survey agreed that the periodic reporting requirements were appropriate to the level of activities in their project, which also improved compared to 2015, when 77 % of the respondents agreed that the requirements for project reporting were reasonable and proportionate.

[Stacked bar chart showing agreement that: the feedback I received on the progress with the content in my project (e.g. in a mid-term review) was useful (N=241); technical reporting requirements were clear (N=265); financial reporting requirements were clear (N=251); overall, the periodic reporting requirements were appropriate to the level of activities in my project (N=269); the electronic tools used for managing my grant were user-friendly (N=271); the process of monitoring my project by REA was smooth (N=270); the process of monitoring my project by REA was transparent (N=264). Response scale: Strongly agree – Strongly disagree.]

Figure 29. Satisfaction of beneficiaries with the reporting and monitoring processes.
Source: The survey of REA’s beneficiaries.

As in the other stages of the project life-cycle and in the 2015 survey, the lowest level of satisfaction related to the user-friendliness of the IT tools employed for grant management (62 %); this result is slightly lower than in the 2015 survey (66 %).

It is important to note that IT tools play an increasingly important role in programme and grant management. During the evaluation period, these tools became more complex, embedded more processes and allowed paperless workflows. The analysis showed that the development of the IT tools contributed to improved delivery of programme management functions (improved TTG, TTP, TTA, etc.) and to a growing level of satisfaction among beneficiaries with most processes during the grant life-cycle (application, contracting and grant management). On the other hand, demands on the IT tools and their user-friendliness also grew, which could explain, at least to some extent, the relatively low level of satisfaction among beneficiaries with the user-friendliness of the IT tools.

The share of projects that achieved all or most of their objectives could reflect the qualitative aspects of both the grant selection process and the grant management and follow-up process. REA’s Annual Activity Reports showed that over 95 % of FP7 projects that closed in 2015-2018 achieved all or most of their objectives, significantly above REA’s AWP target of 90 %. In addition, the share of FP7 projects reaching all or most of their objectives declined slightly during 2012-2017. This evolution could relate to the fact that the most complex projects are usually finalised towards the end of a Framework Programme45.

45 REA 2017 Annual Activity Report.
FP7 projects which reached all or most of their objectives (target: 90 %):

Year    2012     2013     2014     2015     2016     2017     2018 (1st sem.)
Share   98.0 %   96.8 %   96.5 %   96.2 %   95.8 %   95.7 %   96.0 %

Figure 30. Share of closed projects which reached all or most of their objectives in 2012-2018.
Source: Compiled by PPMI based on the AARs of REA.

Execution of the operational budget commitments

As in the previous evaluation period, during 2015-2017 REA managed to achieve full
execution of its operational budget both in commitment and payment appropriations.

Legality and regularity of the programmes’ expenditure

FP7

REA set up internal control processes aimed at ensuring adequate management of the
risks relating to the legality and regularity of the underlying transactions, taking into
account the multi-annual character of the programmes as well as the nature of the
payments concerned. The related control objective is to ensure that the residual error
rate does not exceed 2 % (materiality threshold) on a cumulative basis by the end of
each programme implementation. The starting point for the calculation of the residual
error in the Research Family is the Common Representative Audit Sample (CRaS), which
aims at estimating on a multi-annual basis the error rate at the level of the Research
Family, across all the services involved in their management, provided that the risk
profile of the schemes implemented are comparable.

The Space and Security themes of the Cooperation Programme managed by REA were
implemented according to the general FP7 funding rules. As a result, the CRaS
representative error rate was used as a basis for calculating the residual error rate for
Space and Security research actions implemented by REA. The residual error rate for the
Space programme at the end of June 2018 was estimated at 3.32 % and for the Security
programme at 3.66 % (refer to Figure 31).

The Research for the benefit of SMEs actions of the Capacities Programme and the People
Programme implemented by REA have a different risk profile than other FP7
programmes. For these activities, the CRaS representative error rate alone cannot be
taken as a reference and REA is relying on detected error rates from a wider range of
control data in order to provide indications of the likely residual error rates for each of
these programmes. The residual error rate at the end of June 2018 was estimated at
5.78 % for the SMEs actions and 1.55 % for the People Programme.

With respect to the SMEs actions, an additional exposure related to a recurrent error whereby SMEs did not comply with one of the formal eligibility criteria for their declared subcontracting costs: the majority of the funding under this scheme was directed to the outsourcing of research activities to RTD performers. This funding was sometimes channelled directly from the coordinator to the RTD performers rather than ‘transiting’ through the SMEs to whom the services were delivered. While such cash flows could be allowed, some SMEs failed to ensure that the formal eligibility requirement set out in the grant agreement (namely that the costs declared under the grant agreement are duly recorded in their accounts) was respected. Non-compliance with this contractual requirement may lead to the recovery of funds (if not corrected), even though the non-recording of RTD costs in the SME’s accounts relates to a rather formal requirement and does not imply that funds were used for purposes other than intended (contracting of RTD performers). Desk audit campaigns launched by REA in 2014 and 2015 showed that this risk had a high prevalence, with a financial exposure of more than 10 %. Therefore, REA decided that from the beginning of 2015 specific additional ex ante controls would be systematically performed to ensure the appropriate registration of RTD performers’ invoices in the SMEs’ accounts before making any final payment, which allowed for a reduction in the financial exposure46. For more details on the introduction of the additional ex ante controls, refer to the assessment of REA’s key success stories and lessons learned in Annex 5.

The estimated residual error rate remained within a similar range over 2014-2018 for all
FP7 programmes implemented by REA. Although the residual error rate exceeded the
materiality threshold for the FP7 Space, Security and SME actions, this could be
attributed to a large extent to the complexity of the funding schemes. Moreover, error
rates above the 2 % materiality threshold were also found in most other parts of FP7.
Therefore, increasing controls in order to meet the 2 % materiality criterion would
require fundamental changes in programme implementation arrangements, which
according to REA’s AARs would not be cost-effective.

Residual error rate by programme (target: 2.00 % for all):

                      Space     Security   SME       People
2014                  2.08 %    2.08 %     5.13 %    1.08 %
2015                  3.12 %    3.12 %     6.30 %    0.76 %
2016                  3.18 %    3.55 %     5.99 %    1.55 %
2017                  3.19 %    3.54 %     5.79 %    1.55 %
2018 (1st semester)   3.32 %    3.66 %     5.78 %    1.55 %

Figure 31. FP7 residual error rate in 2014-2018.


Source: Compiled by PPMI based on the AARs of REA.
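The comparison against the materiality threshold discussed above can be illustrated with a short sketch. This is not part of REA's audit methodology; it simply applies the 2 % threshold to the rates reported in Figure 31 for the first semester of 2018.

```python
# Residual error rates at the end of the first semester of 2018 (Figure 31).
MATERIALITY_THRESHOLD = 0.02

residual_error_rates = {
    "Space": 0.0332,
    "Security": 0.0366,
    "SME": 0.0578,
    "People": 0.0155,
}

# Programmes whose residual error rate exceeds the materiality threshold.
above_threshold = {p: r for p, r in residual_error_rates.items()
                   if r > MATERIALITY_THRESHOLD}
print(sorted(above_threshold))  # ['SME', 'Security', 'Space']
```

Consistent with the text, only the People Programme stays below the 2 % threshold.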

H2020

The Financial statement accompanying the Commission's proposal to the legislative
authority for the Horizon 2020 Regulation stated: “The Commission considers therefore
that, for research spending under Horizon 2020, a risk of error, on an annual basis,
within a range of 2-5 % is a realistic objective taking into account the costs of
controls, the simplification measures proposed to reduce the complexity of rules and the
related inherent risk associated to the reimbursement of costs of the research project.
The ultimate aim for the residual level of error at the closure of the programmes after
the financial impact of all audits, correction and recovery measures will have been taken
into account is to achieve a level as close as possible to 2 %.”

46
REA 2017 Annual Activity Report.

Based on this, the overall target for the residual error rate is aimed at being as close as
possible to 2 % (within the range of 2-5 %) for all H2020 programmes implemented by
REA with the exception of MSCA, where a specific target of below 2 % is set.

According to REA’s 2017 AAR, 110 of the 142 audits of the first Common Representative
Sample (CRS) for H2020 had been closed by 31 January 2018. Given the partial closure
of the sample, the 2017 AAR referred to the residual error rate estimated by also
including the draft audit results of the 32 ongoing audits, which was 2.24 % (for the
Research family as a whole). The programme is multi-annual and so are the error rates;
since the nature of expenditure in the first years of the programme might not be fully
representative of expenditure across the whole programme, the residual error rate
should be considered over time.

Concerning MSCA, there were not enough audits to draw any conclusions for H2020.
Nevertheless, based on historical figures for FP7, the first results of MSCA audits and the
new simplified rules under H2020, the error rate was expected to remain below 2 %.

Cost efficiency of the management and control arrangements

The cost efficiency of the management and control arrangements can be expressed as
the ratio between the administrative budget of the Agency and the operational budget it
manages. As in the previous evaluation period, during 2015-2017 REA proved to be an
efficient and cost-effective structure for the management of the delegated programmes.
Its administrative budget (excluding the share of costs of the central administrative and
logistical support services provided for the Research Family and other services) was
below 3 % of the operational budget, based on both commitment and payment
appropriations; overall, the cost-effectiveness of REA grew over 2010-2017 (refer to
Figure 32). Although REA’s management cost did not include the additional costs for
policy development and supervision by the Commission services, management costs for
programme implementation tasks of less than 3 % remained well below the maximum
ceiling for administrative expenditure provided in the legal bases: 6 % for FP747 and
5 % for H2020.

[Chart: two panels, ‘Commitment appropriations’ and ‘Payment appropriations’, each
showing the ratio for 2009-2017 with two series: including support services and
excluding support services.]

Figure 32. Ratio between REA’s administrative and operational budget.


Source: compiled by PPMI based on the AARs of REA.
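The ratio underlying Figure 32 is a simple quotient of the two budgets. The sketch below uses hypothetical budget figures (not actual REA data), chosen only to illustrate a ratio just under the 3 % level reported above.

```python
def cost_efficiency_ratio(administrative_budget: float,
                          operational_budget: float) -> float:
    """Administrative budget expressed as a share of the operational budget."""
    return administrative_budget / operational_budget

# Hypothetical example: a 60 MEUR administrative budget managing a
# 2 100 MEUR operational budget gives a ratio just under 3 %.
ratio = cost_efficiency_ratio(60.0, 2100.0)
print(f"{ratio:.1%}")  # 2.9%
```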

47
People, Capacities and Cooperation specific programmes.
To what extent has REA led to an improved management of the programmes in
terms of simplification of the procedures and flexibility in the implementation of
delegated tasks?

We understand simplification as the introduction of new and better management and
implementation arrangements: simplification, streamlining and harmonisation of funding
rules and procedures across different programmes and programme strands; wider use of
systems for electronic data management and electronic data exchange between the
administration and beneficiaries, as well as user-friendly IT systems; simpler forms of
grants (such as the use of simplified cost options – lump sums, standard scales of unit
costs, flat-rate financing); and other forms of simplification. A wider use of well-tailored
simplified implementation mechanisms and procedures should contribute to reducing the
administrative burden on both the beneficiaries and REA, shifting the focus from project
inputs to the results and impact of the EU programmes, as well as achieving additional
efficiency gains and enhancing the capacity to adapt to periods of high workload.
Furthermore, decreasing the complexity of the requirements should also contribute to
reducing the error rate and increasing the regularity of programme expenditure.

The previous evaluation study (2012-2015) of REA48 concluded that REA was
continuously fine-tuning its internal procedures and predefined practices and updating
them through simplification measures. These simplification measures helped ensure that
activities were executed in an increasingly efficient manner. During the current
evaluation period, REA, in cooperation with the Commission, continued to optimise its
procedures and programme implementation functions and introduced a number of
further simplifications, some of which are presented below. REA’s Networks played an
important role in the optimisation and simplification process (refer to the in-depth study
area covering REA’s success stories and lessons learned in Annex 5 for more details).
However, there is scope for further streamlining across other programmes and Executive
Agencies.

Remote evaluation of proposals

Because of the steadily rising number of proposals submitted, REA introduced fully
remote evaluation procedures for some calls with high numbers of proposals (e.g. FET-
Open in 2014). In the past, only the individual reading of proposals by experts was done
remotely; the simplification measures, however, aimed to move towards a fully remote
evaluation process (including the consensus phase) for the MSCA ITN and IF calls. Only
the central panel meetings would still be held in Brussels with a limited number of
experts.

Regarding MSCA, REA achieved this goal in 2016, when the IF and ITN calls were all
evaluated remotely. Fully remote evaluations were also transposed to other programmes
and actions.

As a result of remote evaluations, travel and accommodation costs as well as daily
allowances for experts could be reduced, and the office space used for evaluations could
remain stable regardless of the number of proposals evaluated. In addition, REA could
use a wider pool of experts, including those with very specific expertise and profiles who
had difficulties coming to Brussels for longer periods during the central consensus
meetings. Consequently, not only the cost efficiency and economy of the process but
also the quality of the evaluation improved49.

Remodelling of the evaluation building

48
PPMI (2016). Evaluation of the Operation of REA (2012-2015). Final report.
49
REA 2016 Annual Activity Report.
During the first semester of 2018, REA dedicated significant efforts to remodelling the
evaluation building50. This involved remodelling floors 1-6 (walling, installations). By the
end of June 2018, the Agency was planning to partially equip these floors with new
furniture. The seventh floor has been fully available to ERCEA since the beginning of
2018. The work on the fourth floor of COVE was ongoing, as reported in REA’s 2018
Interim Report. Two audioconference and two videoconference rooms were fully
operational by the end of June 2018; in addition, REA planned to equip 13 new audio- or
videoconference rooms (on the third, fifth and sixth floors) by September 2018.

Improved procedures for allocating proposals to the most suitable experts

Since 2015, REA has been using an IT tool (the PEAS allocation module in SEP) that
automatically pre-allocates proposals to experts based on the best match between
keywords describing the scientific field of the proposal and the scientific profile of the
expert evaluators. By using a tailor-made set of keywords (rather than the standard set
of keywords established for H2020 as a whole), the automatic matching algorithm
performs better in terms of speed and quality. The pre-allocation is checked by REA staff
and can be updated manually.

This new allocation system allowed REA to improve the quality of allocation of proposals
to experts while making this process highly efficient, especially for calls with a high
number of submissions and in the bottom-up programmes51.
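The keyword-matching approach described above can be sketched as a simple similarity ranking. The Jaccard score, data model and names below are illustrative assumptions, not the actual PEAS algorithm.

```python
# Minimal sketch of keyword-based pre-allocation of proposals to experts.
# The scoring function (Jaccard overlap) and all data are hypothetical.

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two keyword sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def pre_allocate(proposal_keywords: set, experts: dict, n: int = 3) -> list:
    """Return the n best-matching expert names, highest keyword overlap first."""
    ranked = sorted(experts,
                    key=lambda e: jaccard(proposal_keywords, experts[e]),
                    reverse=True)
    return ranked[:n]

# Hypothetical expert profiles keyed by name.
experts = {
    "expert_a": {"photonics", "lasers", "optics"},
    "expert_b": {"genomics", "bioinformatics"},
    "expert_c": {"optics", "imaging", "photonics"},
}
print(pre_allocate({"photonics", "optics", "sensors"}, experts, n=2))
```

As in the process described above, such a ranking would only be a pre-allocation: REA staff review the result and can override it manually.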

Measures to automate and improve the detection of possible conflict of interest

Seeking to safeguard the quality and efficiency of the evaluations, REA took specific
measures to automate and improve the detection of possible conflicts of interest for
experts. Following proposal submission and before the start of evaluations, REA’s
operational units checked for possible conflicts of interest among the selected experts in
relation to the proposals to be evaluated. This check complemented each expert’s
self-declaration on the absence of conflicts of interest. In addition, conflicts of interest
continued to be checked throughout the entire evaluation process.

The new IT tool (ARIS) aims to provide automatic and streamlined detection, whereby
text-mining is applied to the list of participants submitting proposals and to the pool of
experts linked to a certain call. This would allow potential links to be identified, thereby
pointing to a possible conflict of interest. The new process, once fully operational, would
be faster, more effective, more consistent and less prone to error compared to manual
cross-checking52.
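The automated cross-checking attributed to ARIS can be sketched as a match between proposal participants and expert affiliations. The normalised exact-name matching below is a deliberate simplification of the text-mining described above, and all names are hypothetical.

```python
# Minimal sketch of automated conflict-of-interest screening: flag
# expert/proposal pairs where an expert's affiliation matches a
# participating organisation. Matching rule and data are assumptions.

def normalise(name: str) -> str:
    """Lower-case and collapse whitespace so near-identical names compare equal."""
    return " ".join(name.lower().split())

def flag_conflicts(proposal_participants: dict, expert_affiliations: dict) -> list:
    """Return (expert, proposal) pairs that share an organisation name."""
    flags = []
    for proposal, orgs in proposal_participants.items():
        org_set = {normalise(o) for o in orgs}
        for expert, affiliation in expert_affiliations.items():
            if normalise(affiliation) in org_set:
                flags.append((expert, proposal))
    return flags

# Hypothetical call data: participants per proposal, affiliation per expert.
participants = {"P-001": ["Acme Research Ltd", "Uni Example"],
                "P-002": ["Other Institute"]}
affiliations = {"expert_x": "ACME  Research Ltd", "expert_y": "Somewhere Else"}
print(flag_conflicts(participants, affiliations))  # [('expert_x', 'P-001')]
```

As in the process described above, flagged pairs would point to a *possible* conflict for human review rather than an automatic exclusion.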

Electronic workflows and wider use of IT tools

Under FP7, a considerable number of documents had to be printed and physically
circulated from one office to the other. This had a serious impact on the whole workflow
and especially on the TTP since all revised documents had to be re-submitted as
originals. The FP7 2013 Space call grant agreement preparation workflow was selected to
serve as a pilot for the development and test-use of the H2020 IT tools
(Sygma/COMPASS, parts of the Participant Portal). The pilot prepared and anticipated the
H2020 paperless interactions among various REA actors and with beneficiaries,
introduced automatic registration of documents and transactions, but did not yet foresee
an electronic signature. Electronic pilot-workflows for payments, amendments etc.
followed. Based on this experience, electronic workflow tools were generalised across the
whole Research family under H2020 and the electronic signature was introduced, reaping the
full benefits of automation. The use of the electronic workflow increased substantially in
2015 with the growing number of H2020 paperless files. With the electronic workflow,

50
REA 2018 Interim Report, p. 81.
51
REA 2016 Annual Activity Report.
52
REA 2017 Annual Activity Report.
the time gain was significant because files moved immediately to the next stage without
delays or manual handling53.

Desk research, the beneficiaries’ survey and the interview programme confirmed that the
development of the IT tools and electronic workflows contributed to an improved delivery
of programme implementation functions and a growing level of satisfaction among
beneficiaries with most grant management processes. On the other hand, the
beneficiaries’ survey revealed that satisfaction with the user-friendliness of the IT tools
employed for application and grant management was relatively low; this level of
satisfaction has not improved compared to the previous evaluation period. Thus, further
efforts are needed to improve the user-friendliness and user experience of the corporate
IT tools related to programme implementation (which are not under REA’s control).

Efficiency of internal organisation and procedures

REA’s internal organisation and procedures were well suited to the specific nature of the
tasks delegated to the Agency during the evaluation period. REA was not only responsive
to the specific needs of the delegated programmes; it also accommodated the key
changes induced by the extended mandate and the emerging needs of its staff. Following
the review of its internal organisation and procedures, the Agency increased its
operational efficiency and optimised its day-to-day procedures by restructuring its
Department responsible for horizontal tasks and central support services, organising an
annual workload assessment and launching other measures.

To what extent have REA's internal organisation and procedures been conducive
to its efficiency?

‒ To what extent is REA's internal organisation capable and flexible enough to
rapidly respond to resource needs due to uncertainties related to volumes
of work?
‒ Is the size of the organisation as a whole and the different units
balanced/adequate/fit for purpose?
‒ Is the ratio of administrative vs operational staff fit for purpose?

Overall, the size and structure of REA, and the level of resourcing in each
department, were appropriate to its mandate and delegated tasks during the
evaluation period. The organisational structure of REA was based on separate
organisational units for different programmes and actions, grouped into Department A
‘Excellent Science’54 and Department B ‘Industrial Leadership and Societal Challenges’55
according to the respective pillars of Horizon 2020. The units responsible for the
Agency’s horizontal tasks (Administration and Finance) and central support services
(Participant Validation and Expert Management and Support unit) were organised into the
Administration, Finance and Support Services Department. REA’s organisational structure
was reviewed and effectively aligned to cope with changes, such as the evolution of the
REA mandate. Two units in Department C underwent restructuring, as discussed in
more detail in the assessment of the Agency’s key success stories and lessons learned in
Annex 5.

One of the key challenges for the Agency during the evaluation period was that its actual
workload was higher than estimated in the 2013 CBA. As

53
REA 2015 Annual Activity Report.
54
Department A was responsible for the implementation of Marie Skłodowska-Curie and FET-Open
programmes.
55
Department B was responsible for the implementation of Space Research, Sustainable Resources for Food
Security and Growth, Inclusive, Innovative and Reflective Societies, Safeguarding Secure Society, Spreading
Excellence, Widening participation, Science with and for Society programmes.
explained in section 4.2.4, many factors beyond the Agency’s control influenced its
workload, such as the average grant size and the number of proposals and grants.
Nevertheless, around 65 % of the staff reported that their workload was acceptable,
which was higher than the 2016 average of the Commission and other Executive
Agencies (59 %). In addition, the type of work was an important factor in lower staff
well-being at work for some Function Groups: repetitive work, for instance, lowered
well-being at work among Function Group II agents56. Their long-term sick leave
increased to about 8 %, compared to approximately 5 % for all other function groups57.

The Agency took a series of steps to cope with the varying levels of workload and other
challenges affecting staff well-being. As part of adjusting its recruitment strategy, REA
formulated its learning and development needs on the basis of the relevant policy
objectives and identified the staff profiles necessary to achieve them. The Agency also
introduced annual workload measurement, which facilitated reallocations of staff
between different entities of the organisation. Although the extent to which certain units
and staff groups were able to take advantage of these staff reallocations differed, it
effectively contributed to the Agency’s efforts to cope with the varying levels of workload.
As a result of the actions taken, no business continuity problems were mentioned during
the interview programme. However, no specific measures were adopted by REA or the
Commission to address the issue of repetitive work during the evaluation period.

Our interviews suggest that its HR planning and staff recruitment processes ran
smoothly. With the active participation of operational entities in the recruitment process,
REA effectively matched the Agency staff composition in terms of project, financial
and contract management with the diversity of the tasks related to the specific
programmes and the emerging needs in the face of inherent challenges. Based on the
Commission’s job screening exercise for 2016 (Sysper data), out of 739 staff, the Agency
employed about 9.5 % administrative support and coordination staff, 25 % neutral
staff58 and 65.6 % operational staff; these shares were nearly equal to the averages of
all Executive Agencies across all three groups59. Around 75 % of REA staff reported that
their skills matched their current job, a value similar to the averages of the Commission
(77 %) and the Executive Agencies (74 %) based on the 2016 European Commission
Staff Survey. Feedback from the interviews with REA staff revealed that the recruitment
of scientific profiles with strong expertise in project management improved significantly
after the opening of the permanent call for expression of interest in EPSO. The remaining
challenges to HR planning and recruitment were linked to the shortened length of the
recruitment procedure, the limited resources available for implementing it, and the
technical issues which continuously affected it.

To what extent does REA’s human resource management contribute to the


achievement of the Agency’s objectives?
Sub-questions:

‒ Are the staff turnover and vacancy rate well managed?


‒ How are provisions on overtime and teleworking implemented in the
Agency?
‒ How does the Agency follow up on the findings of the latest staff survey
(2016)?

56
Contract agents with Function Group II are typically engaged in clerical or secretarial tasks, office
management and other equivalent tasks.
57
Based on the interview data.
58
Staff responsible mainly for financial transactions, audits and other types of support.
59
Based on Sysper data provided by DG HR in December 2018.
Overall assessment of REA’s HR operations

Overall, the assessment of REA’s HR operations suggests that the Agency was responsive
to the results of the 2016 Commission staff satisfaction survey and the subsequent IAS
audit. REA has effectively linked its multi-annual HR objectives with its strategic
objectives through the adoption of a comprehensive HR strategy, as recommended by
the IAS audit. This has increasingly enabled the Agency to demonstrate a high level of
effectiveness and flexibility in the delivery of its key services within the complex policy
framework in which REA operates. The Agency also made considerable efforts to improve
its HR operations by making use of various monitoring and reporting measures.

Human resource management, staff retention and turnover

By the end of 2015, REA had 608 staff; by the end of 2018, the number had risen to
73560. The Agency maintained a vacancy rate of around 2 % on average between 2015
and 201861. REA’s capacity to effectively recruit new staff depended largely on its
strategic internal decisions as well as on external factors. For instance, in 2017 REA
simplified its recruitment procedure by putting in place smaller selection panels for
contract agents, convened at a higher frequency62. This process proved to be more
time-consuming than the previous procedure, however, and some technical issues
affected it. With the launch of the EPSO database for permanent CASTs and the
extension of the inter-Agency job market to contract agents, REA’s capacity to recruit
new staff improved63.

Staff turnover at REA decreased from 5.7 % in 2015 to 3.1 % in 2018 and averaged
4 % over the evaluation period64. This mostly related to opportunities in other EU
institutions offering higher-grade positions, particularly after the opening of the
Inter-Agency Job Market (IAJM).

The assessment of REA’s key HR indicators and a series of follow-up actions undertaken
by the Agency

The 2016 European Commission Staff Survey 65 presented the assessment of REA’s HR
indicators, highlighting its strengths and weaknesses in this area compared to the
averages of all Commission services and other Executive Agencies. Overall, the Agency
demonstrated consistently high results compared to the averages of the Commission and
the Executive Agencies in terms of the three key indicators, i.e. staff engagement index,
overall job satisfaction and well-being. REA ranked fourth in terms of overall job
satisfaction and fifth in terms of staff valuing the Agency as a modern and
attractive workplace among all the Commission services and the Executive Agencies in
2016. The Agency also demonstrated positive results across some specific areas of REA
staff experience. Some notable examples of where the Agency stood above the averages
of the Commission and other Executive Agencies included the staff willingness to
make an extra effort when required, their understanding of the Agency purpose
and job clarity.

The 2016 European Commission Staff Survey showed, however, that REA staff
satisfaction was less positive in relation to career development opportunities and
mobility, work recognition and equal opportunities as well as internal

60
The data on the snapshot of REA’s personnel actually employed as of 31 December of the reporting year
provided by the Agency at the end of January 2019. These data do not necessarily constitute full-time-
equivalents throughout the year. Based on REA’s administrative data (as of 3 January 2019).
61
REA 2015-2017 Annual Activity Reports and the administrative data on the total staff number at the end of
2018 provided by REA in January 2019.
62
REA 2016 Annual Activity Report, p. 84.
63
REA 2017 Annual Activity Report, p. 78.
64
Based on REA’s administrative data on staff turnover as of 31 December of the reporting year provided by
the Agency at the end of January 2019.
65
EC (2016). 2016 European Commission Staff Survey: Analysis of the findings.
communication. Some relevant examples of the specific areas of career development
opportunities and mobility where the Agency staff were least positive related to
performance and career progression, opportunities to move to another job matching their
skills, the management of career choices and staff mobility. Although staff satisfaction in
these areas was rated least favourably compared to other HR indicators, the ratings were
still somewhat higher than or equal to the levels of agreement for the Commission and
other Executive Agencies. With respect to work recognition and equal opportunities,
Agency staff were less positive about the regular review of their progress and the
provision of clear feedback on their work, the visibility given to their work by their direct
manager, and recognition and praise for good work.
with internal communication, the specific areas rated less favourably were the motivation
from the direct manager to be more effective in their job, the effective delegation of
tasks and responsibilities by their direct manager, as well as the way their direct
manager dealt with poor performance in the team.

In response to the results of the 2016 Commission staff satisfaction survey, REA mapped
its own strengths and weaknesses, examined the overall assessment and presented the
results across all of its units. Based on the results, which emerged from the discussions
with REA’s units, three focus groups were set up to address the concerns raised on
mobility and career opportunities, equal opportunities and work recognition, and internal
communication. Between January and March 2017, each focus group prepared a report
for REA management setting out five priorities and the specific measures and targets to
be used to achieve them66. These priorities and measures were considered by the
management in May 2017 during the Management Away Day, resulting in a selection
of proposed actions with high impact and achievability. On this basis, and with the
close involvement of staff, REA adopted an action plan in response to the results of the
2016 Commission staff satisfaction survey. The plan focused on improving staff
engagement and well-being, especially in relation to career opportunities, staff
recognition, training and equal opportunities, and internal communication67. Most of the
actions68 were implemented from the second semester of 2017. The action plan also set
out KPIs to be monitored, as well as the timeframe. This allowed the Agency to
effectively track progress with the implementation of the action plan through the AWPs
and AARs.

The Agency was responsive to the findings of the IAS audit69 performed at the end of
2016, which identified a number of issues. In line with these findings, and benefiting
from the new General Implementing Provisions (GIPs)70 concerning the conditions of
employment of contract staff, the Agency updated and revised its selection procedures
by clearly defining the tasks and responsibilities regarding the checks to be performed71.
This allowed REA to issue employment contracts more quickly. Some
interviews with the Agency staff during the interview programme in August 2018 raised
questions, however, about the extent to which the revisions made to the selection
procedure for CAs have effectively helped advance the recruitment process. For instance,
although the new selection procedures conducted through a central market allowed CAs
employed at REA to compete for higher-function groups, some of the Agency staff were
concerned about the possibility of losing their positions at the Agency to the external
candidates. In response to the IAS audit results, the Agency also modified the rules on
documenting the selection process and prepared guidance for the procedure, together
with supporting documents such as checklists and training material.
Procedures have been adopted to ensure that all panel members follow training. In

66
The 2016 Commission Staff Satisfaction survey observed other issues for instance in relation to middle
management, workload and IT tools. However, these issues have not been considered by the focus groups due
to the ongoing actions in the area of middle management and workload planned after the 2014 Commission
Staff Satisfaction Survey as well as the fact that the transitional period from FP7 to H2020 affected the use of
IT tools.
67
REA 2016 Annual Activity Report, p. 84.
68
REA 2017 Annual Activity Report, p. 79.
69
IAS (2016). Audit on Human Resources Management in REA. Final Report.
70
The new GIPs entered into force on 1st January 2018.
71
REA 2017 Annual Activity Report, p. 65.
addition, the HR unit of REA developed a guide to inform panel members about the rules
and obligations governing their work.

Implementation of a new HR strategy

In February 2017, the Agency introduced its new HR strategy with the objective of
providing a clear understanding of the policy framework within which REA strives to
implement sustainable and consistent human resource management practices, to recruit,
develop and retain the best staff72. In addition, the new HR strategy aimed to reinforce
the implementation of the 2016 Action Plan adopted in response to the 2016 Commission
staff satisfaction survey results, and to respond to the 2016 IAS audit, which
recommended that REA complement its HR strategy with a clear link to its strategic
multi-annual objectives and an outline of the indicators, the periodicity of
monitoring, and the annual performance indicators for HR in its AWP. The main building
blocks and areas of focus of the Agency’s new HR strategy are summarised as follows73:

1. HR forward planning and workload assessment;

2. the New Management Mode;

3. adaptation of the Commission policies to the REA context, permanent cooperation and
exchange of best practices with other Executive Agencies placing REA on a permanent
track for improvement;

4. endorsement of the approaches such as the whistleblowing procedure and anti-fraud


strategy to mitigate possible risks of breaches;

5. compliance with the IAS recommendation to review and revise as appropriate the HR
strategy-associated indicators on a yearly basis, and monitor them on a bi-annual
basis;

6. and, finally, follow-up to the 2016 Commission Staff Satisfaction Survey results.

REA designed and implemented an adequate HR management process to deploy a
competent, knowledgeable and engaged workforce in order to deliver its priorities and
core business. Its HR management process was based on four main pillars:
recruitment, learning and development, well-being, and internal
communication and staff engagement. To achieve progress, the new strategy
outlined specific objectives, targets and a timeframe for each of these areas. Although
detailed objectives, targets and timeframes were established for various HR operations
across the four priority areas, the evaluation team could not fully assess their
HR operations across four priority areas, the evaluation team could not fully assess its
implementation due to the lack of data (in particular the pending outcome of the 2018
staff satisfaction survey) in certain areas. For the purposes of the Agency’s new HR
strategy assessment, we analysed the effects of its design in the main areas of concern
(i.e. Career Opportunities, Recognition, Training and Equal Opportunities, and Internal
Communication) as highlighted by the 2016 Commission Staff Satisfaction Survey.

To address the main challenges in relation to career opportunities for its staff and in
line with the action plan, REA dropped the internal rule obliging staff to move out of the
unit after moving to a higher function group. In addition, the Agency’s management
started offering additional responsibilities to volunteer staff, regardless of grade. At the
end of 2017, around 85 % of the job descriptions were harmonised and
published on Sysper. The Agency was in the process of finalising the remaining 15 %
during the first semester of 2018. Towards the end of the first semester of 2018, REA
also launched the Inter-Agency Job Shadowing programme that paired REA staff with
colleagues from EASME, EACEA and DG GROW for a job shadowing experience (16 REA

72 REA (2017). HR Strategy, p. 1.
73 Ibid., pp. 1-2.
staff members have been hosted by EASME and EACEA). Structural challenges that
continued to affect REA staff mobility opportunities during the evaluation period included
the requirement that the share of CAs and TAs be set at 75 % and 25 % respectively,
the fact that senior management positions are filled by officials seconded from the
Commission (which put these positions in most cases beyond the reach of REA staff),
and the change in pension age74 introduced by the 2014 Staff Regulation.

In line with its action plan, REA also devoted significant efforts to improve work
recognition, training and equal opportunities. For instance, the Heads of Sectors
and Deputy Heads of Units’ peer learning network was launched, and the Agency started
encouraging management to commit to providing regular informal feedback to its staff.
In line with its HR strategy, REA has also been developing a competency framework to
support REA staff in the management of their career choices, foster HR processes based
on skills and competencies and facilitate internal mobility through career paths based on
these key competencies. In 2017, REA launched a second phase of the project, which
provided, among other things, self-assessment tools to allow staff to test their mastery in
certain competencies. Along with these actions, the Agency revised its internal mobility
policy in October 2017 to further improve transparency when advertising job
opportunities. Annual appraisal and reclassification exercises are organised for
TAs and CAs (e.g. in 2017, 28 CAs and 27 TAs were reclassified, and four officials
seconded to REA were promoted as a result) in line with REA’s HR strategy 75. In the first
semester of 2018, the Agency adopted a Learning and Development Strategy for 2018
and 2019 and launched a communication campaign to highlight to REA staff the purpose
and benefits of the competency framework for their careers. This included high level
induction sessions (competency labs) for REA staff to inform them on how to make best
use of this competency framework. By the end of 2018, a self-assessment tool for each
main job profile complemented the competency framework.

One of the key actions undertaken by the REA in an effort to improve internal
communication during 2015-2018 was the launch of participatory leadership groups for
managers76 in line with the actions foreseen following the previous 2014 Commission
Staff Satisfaction Survey. The Agency also actively promoted a participative management
orientation (e.g. through publication of REA Success Stories 77). REA’s corporate identity
was also strengthened through the launch of videos, the use of the REA slogan ‘We are
gREAt,’ restructuring of the Intranet and other measures. The Agency participated (with
observer status) in the Commission’s ‘Committee for prevention and protection at the
workplace’, representing all Executive Agencies; in 2018, this committee was active in
preparing the establishment of a similar joint consultative committee specific to the
Executive Agencies located in Brussels. All actions tailored to address specific staff concerns were
communicated to REA staff. While the Agency has been gradually implementing most of
these actions during the evaluation period, some other important actions are still in
progress, for instance the organisation of an ‘away day’ for newcomers.

To what extent have the actual costs (including cost of coordination and
monitoring) of REA corresponded to the estimates of the 2013 CBA? If not, what
are the reasons behind it?
To what extent have the management and execution of the programmes by REA
been cost-effective as compared to the alternative options?

74 Employees recruited before 2014 have been reluctant to move to another Agency, as this is considered a
new recruitment post-2014, with a less favourable regime for acquiring pension rights (1.8 % of pension rights
per year of service, rather than 1.9 % before).
75 REA 2017 Annual Activity Report, p. 7.
76 Managers adopted a Participatory Leadership Charter and signed off the REA 10 management commitments.
They favour participatory approaches for introducing changes or taking managerial decisions that have an
impact on the daily activities and/or conditions of staff.
77 REA Success Stories were the result of high-quality project evaluation, selection and communication
processes, on which the Agency relied to ensure that the projects it funds have a real impact on tackling
societal challenges and on boosting European competitiveness.
To what extent have the actual benefits corresponded to the estimates of the
2013 Cost–benefit Analysis? If not, what are the reasons behind it?
To what extent has the establishment of REA resulted in savings to the EU
budget as compared to the alternative options (e.g. difference in costs between
the Commission option and the Agency option)?

The above set of interrelated questions is addressed and answered in the retrospective
CBA (Section 4.2). The main results related to the questions above and the overall cost
efficiency of the Agency in 2015-2018 are presented below:

 The overall actual costs of the Executive Agency scenario 78 constituted EUR 257.8
million in 2015-2018. To evaluate the extent to which the actual costs corresponded to
the initial Specific Financial Statement (SFS) estimates, it is important to follow the
same assumptions that underpinned those estimates. The SFS estimates (EUR 264.8
million over 2015-2018) were based on the EU contribution; however, REA’s
administrative budget also included EFTA/EEA and third-country contributions
(EUR 11.5 million over 2015-2018) to manage additional operational budget.
Consequently, based on the EU contribution only, the actual costs of the
Executive Agency scenario constituted EUR 246.3 million, which means that
the actual savings amounted to EUR 18.4 million and accounted for 7 % of
the SFS estimates. Significant cost savings occurred in REA’s Title II “Infrastructure
and operating expenditure.” The costs in Title I “Staff related expenditure” were
higher than estimated in the SFS, which was related to higher average staff costs.
Higher staff expenditure may become an even more important issue in subsequent
years since the average staff cost estimates remain constant in the SFS for the 2014-
2020 period, while the actual average staff costs might rise further due to salary
indexation, promotions and increasing staff seniority.

 The costs of the Executive Agency scenario were much lower than the estimated costs
of the in-house scenario. In 2015-2018, the actual cost savings deriving from
the cost difference between the Executive Agency scenario and the in-house
scenario constituted EUR 105 million (or 29 % of the estimated costs under the in-house
scenario).

 Comparing the savings initially estimated in the SFS with the actual savings from the
delegation of tasks to REA, we found that the actual savings during the 2015-2018
period were higher than the initial estimates (EUR 105 million compared to EUR 80
million under the SFS estimates). As forecast in the SFS and the ex ante CBA, the
savings under the Executive Agency scenario primarily resulted from the higher share of
lower-cost contract staff (CAs) employed within the Executive Agency and the lower
overall number of staff.
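As a rough cross-check, the like-for-like comparison in the first bullet above can be reproduced with simple arithmetic. The sketch below uses the rounded amounts quoted in the text, so the computed saving (EUR 18.5 million) differs slightly from the EUR 18.4 million reported in the retrospective CBA, presumably because the latter is based on unrounded amounts.

```python
# Rough check of the cost figures reported above (all amounts in EUR million,
# taken from the retrospective CBA summarised in this section).

actual_costs_total = 257.8   # Executive Agency scenario, 2015-2018
efta_third_country = 11.5    # EFTA/EEA and third-country contributions
sfs_estimate = 264.8         # SFS estimate, based on the EU contribution only

# Put actual costs on the same (EU contribution only) basis as the SFS estimate
actual_eu_only = actual_costs_total - efta_third_country

savings_vs_sfs = sfs_estimate - actual_eu_only
savings_share = savings_vs_sfs / sfs_estimate

print(f"Actual costs (EU contribution only): EUR {actual_eu_only:.1f} million")
print(f"Savings vs SFS estimate: EUR {savings_vs_sfs:.1f} million "
      f"({savings_share:.0%})")
```

Running this yields EUR 246.3 million on the like-for-like basis and a saving of roughly 7 % of the SFS estimate, consistent with the figures reported above.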

4.1.3 Coherence

Internal coherence relates to how far REA’s activities across the different sub-
programmes within Horizon 2020 work together consistently, taking into account REA’s
regulatory framework. A further aspect of coherence is the extent to which there is a
clear and appropriate delimitation of responsibilities and tasks between REA and Parent
DGs, and whether there are any overlaps and gaps. External coherence considers
issues such as the coherence of governance arrangements (including control/supervision
arrangements by the parent DGs) and the extent to which the implementation of tasks
under REA’s delegated programme implementation remit has enabled the parent DGs to
better focus on policymaking tasks.

78 Including the cost of coordination and monitoring by the Commission and the costs of REA covered from
EEA/EFTA and third-country contributions.
Coherence was assessed through a desk research-based assessment of REA’s extended
mandate. A comparison was then made between how these tasks were implemented in
the 2015-2018 period with the 2012-2015 period of operations. Documents examined
included, inter alia, REA’s Delegation Act and the MoU with the parent DGs, the external
communication work plan for REA, documentation relating to guidance on REA’s
structured policy feedback mechanism, as well as reporting and monitoring information
(AAR, budgetary information etc.). In addition, interview feedback fed into different
aspects of the assessment of coherence, such as the views of REA management and staff
on the overall coherence of REA’s delegated programme implementation mandate, and of
policy officials as to how well knowledge management by the Agency and feedback to
policymakers was working. The findings from Survey C (survey of EC officials) have also
been taken into account.

To what extent have there been overlaps/gaps/inconsistencies and
complementarities within the programme portfolio and support services
managed by REA, and how are these addressed?
Is there a clear and appropriate delimitation of responsibilities and tasks
between REA and the Parent DGs? Are there overlaps or gaps? Are the different
responsibilities adequately communicated to the beneficiaries?

No evidence of overlaps, duplication, gaps or inconsistencies was identified. Despite
the wide breadth of its responsibilities, the programme portfolio was found to be broadly
coherent and consistent, albeit thematically diverse. Such problems have at least partly
been avoided thanks to close cooperation between REA and the parent DGs and, more
broadly, across the research and innovation family at the level of programme planning
and implementation.

The extent to which REA’s programme portfolio is coherent in the current 2014-2020
programming period (including legacy aspects of FP7) was analysed. The analysis took
into account the evolution in REA’s responsibilities between programming periods, e.g.
FP7 (2007-2013) and Horizon 2020 (2014-2020), and the extension of its delegated
implementation responsibilities to include further sub-programmes.

The desk research identified strong coherence between FP7 and H2020, owing to
continuity in some programming responsibilities (albeit with an extended mandate). The
programme implementation responsibilities delegated to REA are coherent in that many
(but not all) sub-programmes implemented by REA in 2014-2020 follow on from FP7.
For example, REA was responsible for the delegated implementation of MSCA under the
Excellent Science pillar in both periods, and for most activities within FP7 Security79 and
the successor Secure Societies programme in H2020.

However, REA’s responsibilities were also expanded in H2020 to include additional
objectives and sub-programmes, such as the Future and Emerging Technologies (FET-
Open) and most activities under the objective “Food security, sustainable agriculture and
forestry, marine and maritime and inland water research, and the bio-economy.” In
addition, REA also became responsible for implementing specific objectives within H2020,
such as “Europe in a changing world – Inclusive, innovative and reflective societies.” In
addition, the sub-programmes ‘Spreading excellence and widening participation’ (SEWP)
and ‘Science with, and for society’ (SWafS) were also allocated to REA, having previously
been managed in-house by the EC. REA’s management confirmed through the
interview programme that overall, while its programme portfolio is thematically diverse,
it is coherent and consistent.

79 Specifically, in respect of the transition from FP7 Security to H2020 Secure Societies, a key difference is
that REA has recently been given responsibility for all projects, including those producing classified materials,
whereas previously the EC retained responsibility under FP7 Security for larger-scale demonstration projects
and for classified projects.
REA’s programme implementation responsibilities also demonstrate coherence because
they fall within a single specific programme, Horizon 2020. While REA is
responsible for implementing several different parts of H2020 80 for multiple DGs and
across a broad thematic spectrum, the responsibilities are broadly coherent in that they
are all part of a single programme, with common rules and procedures. Moreover, the
business processes for different sub-programmes are consistent, which means that the
overall grant management process is coherent, even if the thematic areas of intervention
differ. It was noted, however, in the interview programme, that there are some
specificities within the Marie Skłodowska-Curie Actions (MSCA), corresponding to the
different types of activities supported, compared to the rest of H2020.

The overall findings relating to the coherence of administrative and logistical
support services were that the portfolio of common administrative and logistical
support services provided by REA to the Research family was coherent. The centralisation
of such support services was necessary, since it avoided having to set up such services
for each and every sub-programme within H2020. However, the delegation of
responsibilities for legal and financial validation for participants under all direct
management of the Commission in the context of SEDIA (i.e. beyond H2020 applicants)
was only implemented recently. It is therefore too early to assess its effectiveness.

The Commission services delegated the management of administrative and
logistical support services on behalf of the Research family to REA. The
Delegation Act sets out REA’s role in providing common support services. This consists of
a number of tasks, namely:

 Call planning (and publication);

 Contracting and payment of expert evaluators and related activities;

 Logistical support for evaluating applications;

 Validation of the existence and status of legal entities (beneficiaries), the Legal Entity
Authorised Representative (LEAR), and the financial capacity assessment (FCA);

 Managing the Research Enquiry Service (RES).

The scope of these services is set out in the ‘Rules for the administrative and logistical
support services provided by REA for H2020 and certain other programmes.’ The
rationale for making REA responsible for the above-mentioned tasks at the time of REA’s
first mandate was that the FP7/H2020 programmes require significant organisational
management given the scale (involving six DGs, four Executive Agencies and nine joint
undertakings). Arrangements for the common provision of administrative and logistical
support services were therefore seen as necessary within one of the Executive Agencies
involved in FP7/H2020 implementation. REA was viewed by DGs across the Research
family as being the most logical EA in which common support services should be housed.

In 2017, REA was also entrusted with the legal validation of third parties and the
preparation of financial capacity assessments for grants awarded and procurements
taking place under all direct management of the EC, i.e. beyond the three pillars of
H2020. The CSC noted that there is a logic in REA performing these validations across
the EC, since the same standardised procedure applies irrespective of the direct
management programme. According to REA’s 2017 AAR, the benefits of a more
centralised approach include synergies, economies of scale and simplification for
participants81.

80 This includes, for instance, Part I ‘Excellent science’, covering the implementation of the Marie Skłodowska-
Curie actions (MSCA) and the FP7 People Programme; Part IIIa ‘Spreading excellence and widening
participation’; and Part IIIb ‘Science with, and for society’.

Overall, REA delivered a high-quality and effective service to its clients and other
stakeholders in the area of the newly delegated activities, while relying on a new legal
framework (refer to section 4.1.2 and in-depth study area for more details on the
management and provision of the central support services). Although the MoU, signed
between the six parent DGs and the Agency in 2016, did not take these changes into
account, three separate legal documents were adopted in 2018 to guide REA’s
operational work related to the new activities. Firstly, the GPSB endorsed the Rules on
Legal Entity Validation, LEAR appointment and Financial Capacity Assessment in the
context of applications for EU grants, tenders and prizes.
Secondly, the Rules for the provision of validation support services were set out. In
addition, the Agency established the participant validation business process. These
documents were particularly relevant for the Agency’s operational work related to
programme implementation, expert management and the provision of the validation
support services to EU bodies managing grant and procurement procedures in the
context of its extended mandate.

Regarding the scope of the support services other than legal and financial validation of
participants, it was suggested by some interviewees that it could be further expanded to
other DGs relating to EU programmes outside the RTD FPs in the future. However, there
was a concern among some interviewees as to whether it would be appropriate for REA
to serve EU institutions/programmes beyond its core remit of implementing the EU RTD
FPs. Insofar as REA provides some generic administrative and logistical support services
which are non-R&D specific, such as logistical support and/or expert contracting and
payments, this could be possible without undermining coherence. Nevertheless, there
may be other areas, such as operating the research help desk, which depend on in-house
knowledge of the detailed aspects of programme operations and the associated
implementation rules. Thus, some interviewees thought that these areas could
be less easily replicated by REA for other EU programmes outside its core focus on
R&D&I.

Coherence of IT support

While REA appreciates the IT support that it has received from the CSC and DG DIGIT,
there were concerns within the Agency that there is a need to retain the capability to
customise IT tools to meet the differing requirements of its six parent DGs, whose needs
vary considerably.

The role of IT was analysed from the perspective of how far this has been supportive to
REA in implementing its core programme implementation remit, and in the provision of
common administrative and logistical support services. While some IT tools have been
developed for REA by the CSC, others such as the SEDIA (Single Electronic Data
Interchange Area) participant registration tool are still under development by DG DIGIT.
A further issue analysed was the coherence of IT support and whether it is coherent to
have different entities developing different IT tools on behalf of REA.

The Agency relied on different counterparts for the development of the IT tools. While the
CSC was responsible for the development of corporate IT tools for the Research family
overall, DG DIGIT’s role was confined to the development of SEDIA to provide legal and
financial validation services for successful applicants. According to the CSC, this
governance structure has specifically been designed in a way that ensures the
participation of all implementing bodies in the decision-making process, thereby
permitting the concerns of different implementing bodies to be officially addressed. The

81 REA 2017 Annual Activity Report, p. 27.
use of corporate IT tools already developed has led to efficiency savings, especially
through the transition to paperless processing and electronic reporting in H2020.

To ensure effective collaboration with the external IT providers, REA had to work closely
with the CSC and DG DIGIT. Some Agency staff were concerned that the IT providers
they worked with were not always close enough to the business processes REA was
responsible for implementing. For instance, there was a growing need to customise the
IT tools to meet the different needs and expectations of the parent DGs working with
different H2020 sub-programmes. However, the external IT provider did not always fully
understand the needs underlying these processes, which affected the Agency’s work. As
a result, REA played a specific role in some parts of the IT governance (i.e. BPO for
proposal management, expert management and the participants register).

Some REA staff also questioned the adequacy of the overall timelines for the
development of some IT tools (e.g. OSIRIS), as well as the technical support provided.
These concerns primarily related to a number of processes that remained to be
implemented in the COMPASS/Sygma grant management tools, and to IT tools affected
by technical issues lasting several weeks at a time, which had an impact on REA’s
productivity.

To what extent has REA enabled the Commission to better focus on its policy
related tasks?
Sub-question: does REA provide useful information on the implementation of
the delegated programmes and their progress (in terms of management and
content) in support of the policymaking process (e.g. information required for
the Annual Management Plan of the Parent DGs)?

In the area of knowledge exploitation and support to policymaking, REA is guided by the
CSC's Strategy for the common dissemination and exploitation of R&I data and
results (2018-2020). This was developed jointly between the CSC, Commission DGs
and the relevant Executive Agencies (including REA) involved in the implementation of
Horizon 2020. The strategy built on its predecessor, the H2020 legal basis and the
proposal for Horizon Europe. Furthermore, the 2016 MoU was revised to put in place
coordination arrangements to facilitate regular, structured dialogue between REA and its
six parent DGs which established various general tasks and responsibilities for both sides
in the area of provision of information on the implementation of the delegated
programmes and their progress.

However, as discussed in section 4.1.1, REA (like the R&I family in general) lacked a
more operational policy feedback strategy as was recognised by the previous Evaluation
of REA (2012-2015). Thus, the Agency initiated the development of a range of strategic
and operational measures to strengthen its policy feedback mechanism. These
mechanisms have been grouped together and organised into the following categories:

 policy support measures, such as encouraging Open Science, the long-term
sustainability of data and networking between projects (communication platform);

 organisational measures, such as shared folders or common IT platforms with
project-related information;

 communications-related measures, such as the production of a newsletter to be
shared with the parent DG and the creation of a communication plan for the PF
concerned;

 IT-related measures, such as tools to facilitate IT analysis for programmes and the
training of POs;

 policy-development measures, such as REA’s involvement in the preparation of
policy reports;

 proactive management tools, such as organising thematic cluster events linked to
policymaking.

Furthermore, following internal consultations, REA decided to take into account earlier
weaknesses in the quality and availability of policy-relevant feedback, in parallel with the
development of the Policy Feedback guidance document, and an innovative approach was
adopted whereby REA developed a Catalogue of Options for Policy Feedback. The
catalogue presented an extended inventory and typology of policy feedback outputs
produced by the Agency. In addition to regular meetings, the Agency organised cluster
meetings with the parent DGs and other stakeholders, held coordination meetings,
contributed to the preparation of policy reports, participated in thematic events organised
by the EC, contributed to WP implementation/planning, produced reports for Programme
Committees and/or Advisory Groups, produced inputs for EC communication and
dissemination events, collected feedback from the beneficiaries, and contributed to the
identification and publicising of success stories. REA was also involved in multiple other
policy feedback activities. For a more detailed analysis of policy feedback options outlined
in the Catalogue of Options for Policy Feedback, please refer to an in-depth study area on
REA’s coherence and maintenance of know-how within the Commission in Annex 5.

REA has enabled Commission officials to focus on their policymaking responsibilities by
fulfilling its responsibilities relating to implementing delegated parts of Horizon 2020
efficiently and effectively. While progress has been made by REA in strengthening policy
feedback generally, there was room for improvement by further customising support to
inform policymaking. Since different DGs had different policymaking needs, and a desire
for different types of policy-relevant outputs, the evaluation identified cases where the
feedback produced by REA was not used or had only partially been taken up. Recognising
the need to map and document various practices at the unit/parent DG level, REA started
to proactively engage with the parent DGs to formulate their policy feedback needs and
discuss which types of PF activities would be most useful to them as discussed in section
4.1.1. In this process, it remained important to take into account the resource
implications that could be sustained by the Agency in delivering such feedback.

The extent to which policy feedback outputs produced by the Agency in 2015-2018 have
enabled the Commission to better focus on their policy-related tasks was initially
analysed through interviews with EC officials and Agency staff. The evaluation findings
were then confirmed through a series of questions included in Survey C regarding the
types of policy feedback outputs provided by REA to EC officials and the degree of
satisfaction with their frequency, quality and uptake (see Table 7 below)82.
Table 7. Overview of the frequency, quality and uptake of various policy outputs
provided by REA to the parent DGs.

Type of policy output | Share of EC officials receiving the output from REA | Satisfied with frequency | Satisfied with quality | Uptake of policy feedback
Cluster meetings with parent DGs and other stakeholders (NCP, etc.) | 60 % | 82 % | 70 % | 58 %
Coordination meeting with Project Officer/Policy Officer | 67 % | 73 % | 76 % | 70 %
Contributions from REA to the preparation of policy reports | 38 % | 70 % | 76 % | 65 %
Participation of REA in thematic events (specific policy related) organised by the EC | 56 % | 80 % | 84 % | 64 %
Written feedback from REA to parent DGs on the WP implementation | 53 % | 75 % | 71 % | 50 %
Reporting to Programme Committee and/or Advisory Groups | 69 % | 84 % | 87 % | 64 %
Project kick-off/review meeting with Policy Officers attending | 51 % | 79 % | 74 % | 78 %
Innovation Radar | 33 % | 74 % | 64 % | 64 %
Input on REA projects for EC communication and dissemination | 64 % | 69 % | 61 % | 64 %
Collecting and giving feedback about researchers’ needs and satisfaction | 22 % | 70 % | 66 % | 67 %
Follow-up on success stories | 56 % | 60 % | 60 % | 60 %

Source: survey of EC officials: ‘Beyond regular meetings, which types of policy feedback outputs did REA provide to you or your
unit at the Commission during 2015-2018?’ (Yes, REA provided such policy feedback outputs to me or my unit at the
Commission during 2015-2018); ‘How satisfied were you with the frequency of REA’s policy feedback outputs produced during
2015-2018?’ (Very satisfied/Rather satisfied); ‘How satisfied were you with the quality of REA’s policy feedback outputs
produced during 2015-2018?’ (Very satisfied/Rather satisfied); ‘To what extent did you use these REA outputs to inform your
policymaking tasks?’ (To a large extent/To a moderate extent).

82 At the time of the preparation of this report, however, the Agency was still in the process of defining and
implementing the business processes and policy feedback activities associated with the various areas in which
REA collaborated with the parent DGs. Although some of the resulting actions were already taking place at the
time of the interview and survey programmes in August-October 2018, they may not necessarily have shown
their impact yet at the level of the policy officers. This may explain the lower satisfaction expressed by EC
officials during the survey and interview programmes.

The survey of EC officials found that there was a rather high overall level of satisfaction
with the frequency and quality of the specific outputs produced. Based on the Survey C
results, the key policy feedback activities and outputs produced by REA for the
Commission included Reporting to Programme Committee and/or Advisory Group
(69 %), coordination meetings with Project Officer/Policy Officer (67 %) and
Inputs from REA projects for EC communication and dissemination (64 %). The
EC officials ranked the frequency (84 %) and quality (87 %) of Reporting to Programme
Committee and/or Advisory Group most favourably. However, only around 64 % of the
respondents indicated that they have used this policy feedback output to inform their
policymaking tasks. In terms of the coordination meetings with Project Officer/Policy
Officer, the levels of satisfaction among EC officials with the frequency (73 %) and
quality (76 %) of policy feedback outputs were also high. Around 70 % of EC officials
stated that they have used these policy feedback outputs to inform their policymaking
tasks. Regarding input from REA’s projects for EC communication and dissemination, the
respondents were satisfied with the frequency (69 %) but they were slightly less satisfied
with the quality (61 %) of policy feedback output provided. Around 64 % of respondents
indicated that they have used this policy feedback output to inform their policymaking
tasks.

According to EC officials surveyed, Collecting and giving feedback about
researchers’ needs and satisfaction (22 %), Innovation Radar (33 %) and
Contributions from REA to the preparation of policy reports (38 %) were the least
produced by REA. Nevertheless, among EC officials surveyed who stated that they have
received these policy feedback outputs, there were high levels of satisfaction with their
frequency and quality. Moreover, a significant share of these respondents stated that
they have used these policy feedback outputs to inform their policymaking tasks. For
instance, about 70 % of respondents were satisfied with the frequency of policy feedback
output and 66 % were positive about their quality. About 60 % of these respondents
revealed that they have used this policy feedback output to inform their policymaking
tasks. The results regarding the frequency and quality, as well as the utility of the
Innovation Radar and contributions from REA to the preparation of policy reports were
similar. In summary, EC officials who had received and were aware of these outputs and
different policy-relevant activities generally found them useful.

Selected examples of different ways in which the Agency’s policy feedback outputs have
informed the Commission’s policymaking tasks are:

 The identification of relevant projects in a given priority policy area that can be
considered as success stories and used for external communication purposes;

 Developing new schemes/grants, for instance REA data and analysis, which have
enabled DG EAC to develop the MSCA widening fellowships to widen participation;

 Informing the development of the future Framework Programme, e.g. REA provided a
reality check when assessing the feasibility of implementing new ideas;

 Definition of the number of projects/topics to be funded to improve the success rate;

 Successful project outputs, e.g. demonstrations in space to prove new technologies.

Regarding the extent to which the delegated management and implementation of its
tasks by REA has enabled the six parent DGs to better focus on their policymaking tasks,
examples of good practices identified were the following:

 REA includes a detailed overview of policy-relevant developments in its
Annual Activity Report. This reflects the fact that it implements Horizon 2020
projects across a broad policy spectrum. An entire chapter was dedicated, for
example, to H2020 implementation across different thematic policy areas in the 2017
AAR. The AAR reports from 2015 and 2016 also provide detailed feedback about the
Agency's ongoing implementation. The implementation of the predecessor actions
under FP7, as well as the actions delegated under H2020, is covered in the AAR of
2015. The results of the implementation of MSCA, FP7, FET-Open are presented in the
AAR of 2016.

 REA has adopted a thematic clustering approach to analysing the policy
lessons and research results generated through groups of projects. Under this
new system, REA officials analyse and extract policy-relevant information of relevance
to particular DGs. It also organises workshops to discuss the policy lessons and
emerging issues relating to the policy relevance of research results from these
projects. This was viewed as having added value to the policy development process.

These results demonstrate that the extent of take-up of policy outputs has varied
considerably between DGs and types of policy feedback outputs produced.
Furthermore, the survey of EC officials revealed that some policy feedback outputs
produced by REA were rarely used, even if their overall quality was satisfactory. For
instance, about 53 % of EC officials indicated that they have frequently received written
feedback of high quality from REA on WP implementation. However, only half of these
respondents had used this policy feedback output to inform their policymaking tasks. This
suggests that the Agency was in some cases using its resources to produce policy
outputs that were not fully used by the EC. EC officials interviewed from multiple
parent DGs likewise confirmed that they have not always been able to analyse the
policy outputs provided to them by REA.

Overall, only 38 % of EC officials surveyed indicated that they were provided with
sufficient policy feedback to inform their policymaking tasks, as mentioned in section
4.1.1, despite the steady stream of policy feedback and overall good quality of the
outputs produced. This stemmed from the fact that the needs of policymakers have not
yet been fully formulated in a clear and concrete manner by all parent DGs, which has
made it challenging for REA to address such concerns. The evaluation also identified
differences between ‘top-down’ and ‘bottom-up’ research programmes in this regard. In
the case of top-down research programmes, where policymakers develop the AWP, the
link to policy feedback activities was more direct compared with bottom-up research
programmes, such as the MSCA, where different types of feedback were required.
Furthermore, policy feedback appeared to have been better appreciated where REA has
been managing programme implementation for some time, whereas there were
sometimes gaps in expectations as to what it was feasible for the REA to provide within
its resource constraints among DGs where the Agency has only recently taken over
responsibility for particular programmes.

In order to improve the situation, REA and the parent DGs should engage in an
ongoing dialogue (in respect of each programme that REA is responsible for
implementing) to agree the format in which policy outputs are produced and,
crucially, how they are disseminated, so that they reach the policy officials who
could most benefit from them. For more details on the
suggestions made by EC officials surveyed on how policy feedback outputs could be
further improved, refer to Annex 2.

Sub-question: are there any governance (financial and policy) issues in relation
to the implementation of the tasks that are delegated to the Executive
Agencies/ issues as regards the Commission services being able to steer EU
policy or budgetary implementation?

There are two distinctive areas of potential improvement with regards to the governance
issues relating to the implementation by REA of its tasks: the governance arrangements
relating to the provision of policy feedback, and the arrangements relevant to policy and
budgetary implementation.

Regarding the governance arrangements relating to the provision of policy feedback, it
was acknowledged by the stakeholders interviewed that governance arrangements
relating to REA are complex, reflecting the fact that the Agency has six parent DGs.
Indeed, among the Executive Agencies, only EASME has more parent DGs. While
governance arrangements generally appeared to be functioning coherently and
effectively, this nevertheless posed challenges for REA during the evaluation period in
being able to feed back policy-relevant information to address the policymaking needs of
six DGs across very different types of policy areas. Examples of ways in which REA
provided policy feedback were provided in the previous EQ (e.g. thematic clustering of
projects, coordination meetings with project officers, and the development of policy
briefings). However, there were also challenges in terms of standardising the provision of
policy feedback to the different parent DGs, given their differing needs. The research
found evidence that in order for REA to be able to meet the different expectations of
different DGs, each parent DG needs to make an effort to clearly formulate what types of
policy inputs would be most useful at each unit/policy DG level.

Regarding the mechanisms through which the DGs can steer EU policy and budgetary
implementation, the fact that policy DGs are responsible for drafting the H2020 Annual
Work Programmes and for formulating the text for the thematic calls means that there is
ample opportunity to shape the direction of the implementation of different thematic
areas and sub-programmes within Horizon 2020. Since the H2020 AWP are developed on
a top-down basis, the parent DGs are also able to strongly influence the types of research
projects that receive funding, and thereby to ensure that particular policy issues (including
societal challenges) are addressed. However, feedback was received that it is easier for
parent DGs responsible for top-down research programmes, such as those relating to the
societal challenges, to establish their policymaking needs compared with bottom-up
research programmes, such as the MSCA.

Looking at how far the Commission services are able to steer budgetary implementation,
the Agency submits its contributions to the annual budgetary procedure of the
Commission on the basis of the instructions provided by DG BUDG. REA’s Authorising
Officer (RAO) has responsibility for the Agency’s implementation of the delegated budget.
Political responsibility for scrutinising the budget then lies with the parent DGs
responsible for the different respective programmes.

The Research Budgetary Network (RBN), chaired by DG RTD (involving the participation
of other DGs) is a mechanism through which the Commission services can scrutinise the
Agency's operational budget. Further information about the role and modus operandi of
the RBN is provided in a subsequent EQ regarding the monitoring, reporting and
supervision arrangements in place to enable the Commission to benefit from the know-
how created within REA. However, it can be noted in responding to this EQ that this
mechanism oversees budgetary matters in relation to REA’s operational budget, and how
this corresponds to the budget allocated to particular sub-programmes in H2020. This
ensures that the parent DGs can monitor and steer budgetary implementation. The MoU
makes clear that the DGs are ultimately responsible for scrutinising programme
implementation from a budgetary perspective. The MoU states that each parent DG is
responsible for the parts that relate to their respective operational budgets delegated to
the Agency83.

Among the challenges identified in scrutinising budgetary data is the difficulty of striking
a balance between allowing REA to get on with implementing the delegated operational
budget, which underpins its core programme implementation mandate, and ensuring
effective scrutiny without micro-managing the relationship between the Agency and the
parent DGs.

Are appropriate mechanisms and instruments in place, and at which level, to
ensure adequate coordination and information flows between REA and the
Commission services, notably on the content of the projects supported and their
results?

In 2015-2018, mechanisms and instruments have been put in place to strengthen
coordination and information flows between REA and its six parent DGs, thereby
contributing to improved knowledge management.

Overall, appropriate coordination and communication mechanisms have been put
in place to facilitate information flows between REA and the Commission services, in
particular with regard to the content of projects supported and their results. The parent

83 Memorandum of Understanding, p. 7.
DGs stated throughout the interviews that they have received adequate information flows
about REA’s organisational performance in delivering on its delegated programme
management and implementation remit through the AAR. This provided the primary
reporting tool for communicating information to parent DGs and wider stakeholders about
key achievements pertaining to REA’s operational performance. As with other Executive
Agencies, a wide range of management information and KPIs was reported on, such as
Time-to-Grant, error rates, budget implementation, etc.

In addition, in relation to project content and research results, there were a number of
formal reporting mechanisms for information sharing, knowledge management and
the provision of policy feedback that were implemented by REA, such as: the
preparation of a draft action plan to implement recommendations made in the
previous evaluation, an initiative on Strengthening Policy Feedback in REA:
Mapping, Insights and Recommendations, and the creation of the AGILE-Rapid
Reaction Network. Further initiatives included the development of the OSIRIS
information and data mining tool, and the examples of the organisation of
workshops and information sessions by REA on particular sub-programmes, in
conjunction with the relevant parent DG. Some examples of these formal initiatives were
assessed to the extent possible and their contribution to strengthening policymaking was
considered. It should be noted, however, that some instruments and tools have only
been introduced recently, and therefore some activities were more readily evaluable than
others. For example, the action plan to strengthen policy feedback prepared in early
2018 by REA was still being reviewed by the parent DGs at the time of the preparation of
this report. It should also be stressed that a more detailed description and assessment of
these initiatives is provided in an in-depth study area on knowledge management and
policy feedback in Annex 5.

REA has undertaken reflections as to how to strengthen policy feedback through the
development of a document on Strengthening Policy Feedback in REA: Mapping,
Insights and Recommendations. This initiative was developed by the Project
Monitoring and Policy Feedback Task Force (PFTF), which operated from 2016 to
2018. The PFTF was an important initiative by the Agency to improve policy feedback
during the current evaluation period. Such initiatives were found to have helped to
demonstrate that REA is committed organisationally to improving the provision of
coherent and effective policy feedback through improved knowledge management, and
by documenting good practices in the definition and implementation of policy options.
The follow-up of the work of the PFTF has been incorporated into an action plan. The
initiative was designed to follow up on the recommendations relating to improving policy
feedback raised in the evaluation of the 2012-2015 period. However, since the DGs were
still reviewing the feedback provided in this action plan at the time of the preparation of
this report, it was still too early to evaluate the effectiveness of follow-up actions.

Importantly, REA has adopted innovative ways of identifying policy-relevant research
results across groups of projects, in particular by fostering a thematic clustering
approach. Thematic cluster meetings have been designed as a means of discussing
research results across thematically linked projects so as to be able to derive policy
lessons. Interview feedback and the results of Survey C suggested that this initiative was
strongly welcomed by EC officials. At the same time, because of their formal nature,
some concerns were expressed by the interviewees that the cluster meetings could
become an ‘artificial’ reporting mechanism without a direct link to a strategy to achieve
more effective knowledge management. The challenge of ensuring that clusters can be
organised sufficiently quickly was also raised. For instance, EC officials noted that cluster
meetings (along with the preparation of the reports that were published following such
cluster meetings) should be organised with the most relevant policy DG(s) within 2-3
months of the request, particularly if it concerns a new emerging priority area. This has
required the Agency to develop the capacity to organise cluster meetings in a prompter
and more responsive way than was possible prior to this initiative.

A number of activities have also been introduced to facilitate knowledge-sharing and
information flows between REA and different Units within the European
Commission. One of the main mechanisms to provide Structured Policy Feedback was
the AGILE-Rapid Reaction Network. The network was designed to provide a flexible
tool through which REA could provide knowledge and information quickly in the form of a
factsheet overview to its parent DGs (e.g. when aiming to offer qualitative answers to
pressing policy questions from the parent/political DGs in a timely and efficient manner).
Further in-depth information about how this tool worked and its aims is provided in the
case study on knowledge management and policy feedback in Annex 5.

According to REA staff, ensuring the effective implementation of this network, in
particular the need to respond rapidly to requests made by policy DGs, has however
proved both an organisational challenge (in terms of human resources and technical
capacity) and a practical challenge. A need for further capacity-building to strengthen
REA staff knowledge and understanding of the widely varying policy challenges faced by
different parent DGs with policy responsibility for thematically diverse programmes was
identified. It was furthermore suggested that Commission officials from different parent
DGs need to decide in what format policy outputs would be most useful, over and above
the short policy factsheets produced by the Network upon request.

The interviews with the CSC revealed that REA has helped to develop a common
approach to exploitation and dissemination between the EC services and
Executive Agencies through its participation in the CSC’s Executive Committee.
However, some parent DGs reported a need to further improve the tailoring of the
information shared with different policymakers and to agree the communication
channels through which it should be disseminated.

Furthermore, to strengthen access to information, the Agency is of the opinion that EC
officials in the parent DGs should have easy access to relevant databases, including
efficient access to a common database and access to better data mining tools. The
OSIRIS tool84 was still under development in the evaluation period. Once developed, this
should enable the Commission services to better exploit the relevant content of
operational files and to maximise the use of data for better policymaking in line with the
Commission strategy on data, information and knowledge management. It is therefore
too early to assess the utility of this tool.

Formal meetings also took place between REA and officials from the various parent DGs,
and between REA and the CSC through its participation in the CSC Executive Committee,
and in a number of other Committees relating to business processes pertaining to the
grants management system during 2015-2018. The type and regularity of informal
information exchange differed depending on the parent DG concerned. For example, REA
representatives were frequently invited by DG CNECT to attend technical meetings and
preparatory meetings before the Programme Committee and Advisory Group meetings.
DG GROW’s representatives also confirmed that a variety of mechanisms have been used
to communicate with the Agency, in particular through regular scheduled meetings and
the electronic exchange of a significant amount of reporting information on the
implementation of programmes within H2020 for which GROW has ultimate policy
responsibility.

Informal mechanisms to strengthen information flows included day-to-day contact
between REA officials and officials from the various parent DGs. Another mechanism for
informal information sharing to improve REA’s external coherence was the practice of
inviting the Agency Director to attend Director-level meetings of the lead parent DG, DG
RTD. This practice was reported as an effective means of exchanging information about
progress in programme implementation and relevant policy feedback. According to

84 ‘Strengthening Policy Feedback in REA: Mapping, Insights and Recommendations’ and REA 2017 Annual
Activity Report, p. 80.
Commission officials participating in the interview programme, this mechanism was
compatible with maintaining a clear delimitation of roles between the Commission and
the Agency.

Officials from DG RTD and other parent DGs stated that REA already produced adequate
summaries of project results across groups of projects and has sought to identify lessons
learned in this area. However, some officials did not believe that what had been
produced was sufficiently tailored to meet their needs, despite the evidence pointing
to a large variety of policy feedback outputs provided
by the Agency. A more detailed analysis of the provision of policy feedback is presented
under the previous EQ regarding the extent to which REA enables the Commission to
better focus on its policy-related tasks. This also raised an issue as to how far there
should be an onus on policy officials themselves to extract policy-relevant information if
it has already been made readily available and accessible, as opposed to this information
being required to be presented to them in a particular format by REA. There were
differences of opinion in this regard in the interview feedback. Whereas some officials
from parent DGs expressed the view that REA was not doing enough to extract policy-
relevant information and to draw this to the attention of parent DGs, others were of the
view that what REA already produced was more than adequate and any further
information provision could risk information overload.

Is there a clear and appropriate delimitation of responsibilities and tasks
between REA and the parent DGs? Are there any overlaps or gaps?
Are the different responsibilities adequately communicated to the beneficiaries?

The main finding in relation to this EQ was that the broad delineation of tasks is clear
from REA’s regulatory framework. The division of responsibilities is reasonably
clear and was not perceived as being problematic.

However, a more general challenge for REA, in common with other Executive Agencies, is
where the balance should lie between performing delegated tasks relating to programme
implementation, such as programme administration and grants management, the core
business of the Executive Agencies, and tasks relating to promoting knowledge transfer
and to supporting the Commission services in their core policymaking role by providing
inputs to inform policymaking. While the EC services are responsible for policymaking,
the expectation of the parent DGs is that REA will extract policy-relevant information on
their behalf.

The tasks to be performed by REA follow the broad framework set out in Council
Regulation (EC) No 58/2003 of 19 December 2002. The specific delimitation of tasks is
set out in the Delegating Act85. The Commission Decision of 20.12.2013 sets out the tasks
delegated to the Agency relating to the implementation of specific parts of the Horizon
2020 programme and to the legacy parts of FP7. Article 5 sets out the Tasks reserved to
the Commission and makes clear that the EA shall not be involved in making political
choices or directly in policymaking.

The MoU, which outlines the Modalities and Procedures of Interaction, makes clear which
are the tasks of REA and the parent DGs respectively, and how these tasks should be
coordinated. In particular, Chapter 3 sets out arrangements for cooperation between the
Agency and the parent DGs in relation to other actors and institutions.

85 Commission Decision C(2013) 9418 of 20 December 2013 on delegating powers to the Research Executive
Agency with a view to the performance of tasks linked to the implementation of Union programmes in the field
of research and innovation comprising, in particular, implementation of appropriations entered in the general
budget of the Union, as amended by Commission Decision C(2014) 9450, Commission Decision C(2015) 8754
and Commission Decision C(2017) 4900.

The only area where evidence of ambiguity in the delineation of tasks could be
identified therefore relates to REA’s role in informing policymaking. While REA engages in
a number of activities that support policymaking, such as the project clustering approach
mentioned earlier as a good practice, it arguably could be made clearer how REA is
expected to contribute to policymaking in parent DGs, and where the delineation lies
between the Commission’s responsibilities for making policy on the one hand, and REA’s
responsibilities to inform policy development on the other.

In practice, providing feedback and extracting project results across groups of
projects in the same thematic area through knowledge management mechanisms to
inform policymaking, and actually making EU policy, are part of a continuum. This can
therefore result in ambiguities for REA in terms of how far it should be expected to go in
deriving policy-relevant research results. Reference should be made here to the question
within coherence relating to REA’s performance in extracting policy-relevant information
and data for parent DGs.

Feedback indicated that REA has not always received clear direction from the
policymaking DG as to their needs, and has had to anticipate what type of information is
actually needed by policymakers to inform policy development. There was also a concern
among REA staff and management that they do not always receive feedback as to
whether information provided by REA for policymaking purposes has been useful and
actually utilised.

While the delineation of responsibilities is generally clear from the establishment act and
delegated act, an example cited of a lack of clarity at a more detailed level concerned
who is responsible for extracting relevant information relating to cross-cutting calls
with policy interest. However, the issue of who is
responsible for analysing the results of cross-cutting calls relates more to issues
surrounding the legal framework, which are addressed in the effectiveness section.

Given the lack of a standardised format or type of policy feedback to be provided, REA
decided to push for the creation of its own reporting tools to be used as a mechanism for
the identification of policy-relevant feedback. The Policy Feedback guidelines and
Catalogue of Policy Options were developed as a mechanism for standardising and
systematising REA’s work to strengthen policy feedback.

Nevertheless, progress has been made by REA in improving the quality of internal
guidance for its staff on inputs to policymaking and through the development
of mechanisms to capture policy-relevant information. Moreover, the interviews with the
CSC responsible for dissemination and exploitation revealed that REA has made progress
in capturing policy-relevant information, both through specific tools (clustering of
research projects to identify lessons learned), the development of internal REA guidance
on how to extract policy-relevant messages and to identify policy options and in
incorporating materials on different policy areas in REA’s AAR.

Regarding the extent to which the different responsibilities are being adequately
communicated to beneficiaries, interviewees observed that the external stakeholders,
including beneficiaries, are not always aware of the different roles and functions of the
Commission, REA and of different helplines and contact points. In particular, it was
observed that the Research Enquiry Service (RES) and National Contact Points
sometimes receive very similar requests for information from the public but through
different channels. The mandate of the RES is to answer specific questions about
research, and also validation questions within SEDIA. Where the RES central desk cannot
address questions specific to particular thematic open calls within Horizon 2020, it
transfers any such questions received to thematic decentralised helpdesks (which are an
integral part of the RES).

To what extent have the activities of REA resulted in unintended effects (both
desirable and undesirable)?
Sub-questions:
‒ What have been the effects, if any, of those unintended consequences on
policy design or programme delivery?
‒ What, if anything, could be done to minimise negative effects or
maximise positive effects of unintended consequences?
‒ Could anything be done to ensure such unintended consequences might
have been better foreseen?

This EQ relates to whether REA’s activities produced any unintended effects and, if so,
whether these were desirable or undesirable. To the limited extent that unintended
consequences can be identified, it is also necessary to comment on how far these
relate to programme or policy design, or to delivery and implementation issues.

A key finding was that, while the expansion of REA’s remit under Horizon 2020 brought
significant changes to its mandate with major organisational implications, such as a
large increase in staffing numbers, the expanded
mandate does not appear to have been problematic in terms of unintended consequences
or effects. There was generally sufficient time for REA to plan its organisational response
well in advance of new organisational developments, although the delays in the
development of the SEDIA IT tool did cause some unexpected problems in terms of the
transition to REA managing legal and financial validations not only for H2020, but from
2016, also a wider series of EU programmes. However, some unintended consequences
were nevertheless identified, such as the much higher number of remote evaluations
taking place than had been expected, and the difficulties in adapting business processes
to accommodate this.

Examples of areas where there were major changes between programming periods,
but there do not appear to have been unintended consequences are first provided. REA’s
mandate was revised in 201686 and some of the tasks delegated to REA, notably for call
planning, participant validation, and the verification of financial capacity, were extended
beyond Horizon 2020 and the programmes directly implemented by REA (and the
centralised support services performed for other DGs across the Research family) to
include other programmes (e.g. COSME, ERASMUS+, etc.). In particular, programmes
managed by DG HOME (AMIF, ISFP, ISFB) and DG JUST were also included. Although this
represented an increase in the portfolio of support services provided by REA, it was
not identified as having major implications in terms of unintended effects.

During the second quarter of 2017, the Commission also delegated responsibility to REA
for all legal and financial validations of legal entities participating in grant and
procurement actions under direct management in the context of setting up SEDIA.
Although this had been planned for some time, an example of an unintended
consequence is that being made the main future user of the corresponding IT tool has
put pressure on the Agency to play a greater role than it had initially envisaged vis-à-vis
DG DIGIT (the developer).

From 2017, REA’s mandate has also been expanded to cover projects generating
classified information. While this had been discussed for some time as a possibility,
the fact that REA is now able to handle projects generating classified information means
an improvement from a coherence perspective. There were therefore no unintended
effects.

Regarding additional unexpected issues during the 2015-2018 period of


operations, it was mentioned that there were a number of bugs in the IT system that
supports grants management. The CSC has been actively taking steps to address these

86 REA 2017 Annual Activity Report.
and the necessary reprogramming and debugging was completed in August 2018. While
this affected TTG, responsibility for the IT system lay outside REA and was therefore
beyond its control.

The organisation of evaluations and the provision of administrative and logistical support
to experts were cited as examples of areas where the workload had been higher than
expected. The very high number of experts being managed by REA compared with
expectations has required the scheduling and rescheduling of a large number of
meetings. The project officers have often had to do the scheduling manually, which has
been very time-consuming.

A further area of REA’s operations where unforeseen developments took place
concerned the workload implications of the Agency’s decision to organise a large number
of remote evaluations. Remote evaluation was seen as having been a challenge for all
the units because of the volume of applications: for some calls, as many as 10,000
applications were expected. Interviews with the staff from REA’s Unit A2 revealed that the CSC could
not develop a tool to schedule the large number of meetings linked to organising remote
evaluations and to accommodate the rescheduling of virtual meetings. As a result, in
some cases it was necessary to schedule them manually, which was highly
time-consuming. It was argued throughout the interviews that the development of a video
conferencing system would significantly advance the Agency’s operations and allow the
Agency staff to connect with experts working remotely more easily. The consequence of
the greater use of remote experts than had been expected was that REA has had to
undertake a lot of “learning by doing” in terms of managing remote evaluations in order to
gain experience. However, interviewees mentioned that further training will be needed
for REA staff as to how to manage the process of involving remote experts. There was
also an identified need for training videos for experts serving as remote experts.

To what extent has the Commission, in the presence of REA, been able to
maintain an adequate level of know-how in relation to the programmes
entrusted to the Agency? How has this been achieved?
What are the feedback channels, means and methods used for this purpose?
What are areas for improvement, if any?

In addressing the EQ relating to maintaining an adequate level of know-how, we have


drawn strongly on the document ‘Strengthening Policy Feedback in REA: Mapping,
Insights and Recommendations’ and the results from questions relating to the uptake of
knowledge outputs in Survey C.

The main finding was that various policy-feedback mechanisms have already been put in
place by REA to support the Commission’s policymaking activities. REA staff also
explicitly organised and contributed to meetings and conferences, which linked project
coordinators and Commission representatives, and shared information about new
developments in research that have policy implications.

Several new actions were taken during the evaluation period to help the Commission to
maintain an adequate level of know-how. A Catalogue of Policy Options containing
well over 60 measures has been developed. For a more detailed analysis of policy
feedback options outlined in the Catalogue of Options for Policy Feedback, please refer to
an in-depth study area on REA’s coherence and maintenance of know-how within the
Commission. In addition, REA collaborated with DG RTD on the P4P – Projects for
Policy initiative, which adopted a new approach to evidence-based policy making to help
better structure policy feedback efforts. Through this collaboration, the full potential of
REA-implemented programmes, including the MSCA within the Excellent Science pillar, is
being exploited.

Positive interpersonal interactions facilitating information flows between the Commission


and the Agency were reported throughout the interview programme. Despite this, some
interviewees noted that the externalisation of H2020 implementation to Executive
Agencies, including REA, has meant that the Commission has “lost contact” with
programme beneficiaries. Therefore, REA’s initiative to organise thematic cluster
meetings focusing on groups of projects in order to identify policy lessons was welcomed
by EC officials as a positive development. Another issue relevant to the level of know-
how maintained by the Commission related to the extent to which REA has been
proactive in responding to requests submitted by EC officials for policy feedback. EC
officials did not report any issues with regard to the effectiveness of the information flow
when a formal request for policy support had been formulated. The main weaknesses identified concerned the way in which the Agency managed the extraction of policy-relevant outputs from projects in the absence of a specific, formal request for information. Potential areas for improvement include the frequency of information
provision, which some Commission officials thought should be more regular, the format
in which information is provided, and the channels for transmission of policy feedback to
the Commission. The elaboration of the Catalogue of Options is a positive development.
However, further testing of the suggested measures and preparation of a set of effective
working methods would improve the know-how maintained in the Commission.

To what extent does REA’s communication function support the mission of the
Agency and does REA ensure an effective feedback loop with the policymaking
DGs?

The flow of information and communication between REA and the Commission services
appeared to work well during the evaluation period. REA had its own internal
communication function, but also relied upon support from DG RTD’s Communication
Unit. REA’s internal communication function appeared to support the achievement of the
Agency’s mission and its core objectives, for instance, by strengthening awareness about
success stories and through its interaction with policy DGs (including the parent DGs),
promoting the dissemination and exploitation of research results.

According to the REA Internal Communication Strategy 2017, there was a well-developed mechanism for multi-channel internal communication. Several means of communication were in place, including: 1) the Intranet; 2) REA messages; 3) REA Info Sessions and Science Conferences; 4) the Agency's quarterly magazine (The Reader); 5) the e-Bulletin (a bi-monthly electronic newsletter on REA administrative matters); 6) promotional material; 7) audiovisual productions. The Strategy set out three main priorities, namely: 1) fostering effective two-way communication; 2) promoting best practice in working methods across REA; 3) increasing awareness of REA activities. A series of activities were foreseen: further strengthening of the REA Internal Communication Info Sessions, reinforcement of REA's in-house video production capabilities, and restructuring of the Intranet in order to improve the usability of the information available and to harmonise the Intranet pages.

REA had also adopted an External Communications Workplan87, which outlined a number of priorities, actions and communication tools to strengthen external communication. The external communications work plan provided an effective framework for REA to undertake communications activities.

REA’s external communication activities focused on three priorities88 during the evaluation period: boosting awareness of new funding opportunities and broadening the group of participants, consolidating a service-oriented communication, and supporting the parent DGs in giving visibility to EU research via success stories and providing input to the policy-feedback loop, as discussed at the end of section 4.1.1. The relevant information was reported via DG RTD and other parent DGs' Communication Units. Data were also transmitted via specialised channels such as the Research and Innovation portal (InfoCentre), the Horizon Magazine, and other press releases and publications. Overall, REA’s activities in the area of external communications played an important role in helping the policy DGs to communicate with external stakeholders about the achievements of projects, for instance through the identification and promotion of success stories and work carried out to encourage new applicants to participate in Horizon 2020.

87
REA (2018). External Communication Workplan.
88
REA 2018 Interim Report, p. 70.

Have the monitoring, reporting and supervision arrangements in place enabled


the Commission to benefit, in the short and medium term, from the know-how
created within REA?

The main finding was that the AAR (and the biennial report produced by REA) was a crucial monitoring and reporting tool through which the Agency explained its activities and operational performance to its parent DGs at the Commission and to wider stakeholders during the evaluation period. It also provided an outlet for REA to demonstrate the knowledge it had acquired across different policy areas. In the 2017 AAR, for instance, there was a chapter dedicated to the different policy areas for which REA was responsible.

There were, however, other mechanisms within the overall supervision arrangements for sharing information with the parent DGs to ensure appropriate scrutiny. Based on the feedback obtained through the interviews, the Research Budgetary Network (RBN) plays an important role as the overall coordination and information-sharing forum on REA’s research budgetary matters. The RBN is organised by DG RTD and mainly involves the participation of other parent DGs, such as DG EAC and DG MOVE, while REA and other Agencies are sometimes present. Respondents from several DGs noted that they were involved in reviewing the operational budget data produced for the RBN. If specific questions or issues about the budget are raised on an ad hoc basis by particular DGs, they are discussed through the network, with clarifications provided by REA where necessary. Regular monitoring, reporting and supervision arrangements encouraged the development of a shared and common understanding of REA’s operational budget across the different sub-programmes for which the Agency was responsible. This in turn has contributed to effective supervision and monitoring.

To what extent would the closing down of REA result in losing significant know-
how in relation to the management of the programmes entrusted to REA?

The counterfactual situation was analysed, i.e. the extent to which closing down REA would result in losing significant know-how in relation to the implementation of the programmes entrusted to REA. Insofar as REA has been responsible for implementing some FP sub-programmes for a decade, closing down REA could potentially result in a significant loss of know-how in relation to the different business processes involved in grants management.

The value added of REA is mainly embedded in the synergies generated and efficiency gains made in the implementation of many different sub-programmes within a single Agency. Furthermore, value has been generated through the Agency’s role in providing common administrative support services (such as the organisation of logistics for experts) in an integrated manner on behalf of the whole Research family, instead of these being implemented by several Commission DGs. Both the efficiency savings identified in the CBA (see section 4.2) and this value added would be lost were REA to be closed down.

A further consideration was how realistic the counterfactual alternative scenario of the EC implementing sub-programmes within H2020 would be, given the trend towards the externalisation of programme implementation. The interviews with REA staff and with officials from the parent DGs suggested that they did not see the alternative scenarios as realistic. Instead, they viewed it as appropriate for the programmes for which REA is responsible to continue to be implemented on an externalised basis, due to the risks associated with a change in implementation structures. Even in the case of programmes formerly managed by the Commission services and currently implemented by REA, closing down REA was viewed as unrealistic from a programme management and implementation know-how perspective, since the Commission does not presently have the necessary human resources or technical capacity to perform programme implementation tasks on the same scale as REA.

The “alternative scenario” could be more nuanced than simply closing down REA. For instance, some aspects of H2020 Secure Societies were formerly managed in-house by the EC during FP7 Security and could still potentially be managed by the EC, since some know-how remained. However, this know-how has arguably been dispersed, since the programme was managed by DG GROW in FP7, whereas in H2020 it was managed by DG HOME. The fact that the scale of the RTD FP programmes has significantly increased over time means that, without a significant transfer of staff from REA, it would be difficult for either the Commission services or alternative Executive Agencies to take over REA’s current portfolio of programme implementation tasks.

Respondents from the Commission were positive about REA’s know-how from a programme management and implementation perspective (e.g. the process of managing calls for proposals, the application and selection process, and grants management, including contractual amendments and project monitoring). However, EC officials expressed, throughout the interview and survey programmes, some reservations about the extent to which REA has been able to retain knowledge that could be policy relevant. The less positive findings in this regard reflect the fact that REA covered a broad range of sub-programmes within Horizon 2020, spanning very different policy areas. It was therefore not easy for the Agency to demonstrate in-depth knowledge of the policy ramifications of research projects across all EU policy areas as consistently as the parent DGs responsible for policymaking would like. However, an ongoing process, begun in 2018 and extending beyond the evaluation period, of reaching common agreement with each DG on their policymaking needs, and on the types of policy-relevant information and knowledge outputs they would find most useful based on the Catalogue of Policy Options approach, should help to overcome this challenge by allowing Agency staff to learn more about given policy areas and then to customise policy outputs accordingly.

4.2 Results of the retrospective cost-benefit analysis

4.2.1 Background of the quantitative CBA

According to Article 3(1) of Council Regulation (EC) No 58/2003, a decision on setting up an Executive Agency shall be based on a prior cost-benefit analysis (CBA). Further, as part of the interim evaluation required under Article 25, the costs and benefits of the selected delegation scenario as identified by the original CBA shall be tested again and the results of the CBA updated if need be.

A detailed CBA of all the Executive Agencies (including REA) was conducted in 2013 for
the 2014-2020 Multiannual Financial Framework. The CBA compared the following
scenarios based on varying levels of delegation and distribution of programmes between
different Executive Agencies:

 An in-house scenario assuming that the new programmes would be managed by the
Commission while EAs would remain responsible for the delivery of legacy work (2007-
2013 MFF programmes);

 An initial delegation scenario established by the Commission;

 Two alternative delegation scenarios with different options for delegation among
Executive Agencies.

An overview of the initial delegation scenario and two alternative scenarios for the
delegation of programme implementation tasks to REA considered in the CBA is
presented in the table below.

Table 8. Overview of the different delegation scenarios.

Initial scenario

Legacy of 2007-2013 programmes:
 FP7 Capacities: Research for SMEs and SME associations
 FP7 Cooperation: Security
 FP7 Cooperation: Space
 FP7 People
 FP7 support services

Proposed new programmes (2014-2020):
 Horizon 2020: Marie Skłodowska-Curie actions
 Horizon 2020: FET-Open
 Horizon 2020: Inclusive, innovative and secure societies (security research; societies research)
 Parts of the Horizon 2020 SME instrument financed from the programmes delegated to REA (as listed above)
 Common support services to other bodies involved in the implementation of Horizon 2020 and validation of FELs for EAC/EACEA participants and the ERCEA
 European Maritime and Fisheries Fund – scientific advice

Alternative scenario 1 (changes to programme portfolios of Executive Agencies, with the treatment in the initial scenario in brackets):
 Delete from REA mandate: transfer legacy of FP7-Space to EASME (former EACI) (initial scenario: management of legacy by REA)
 Add to REA mandate: delegation of H2020 societal challenge – Food security, sustainable agriculture, marine and maritime research and the bio-economy – to REA (initial scenario: delegation to CHAFEA (former EAHC))
 Delete from REA mandate: management of EMFF scientific advice by EASME (initial scenario: management of EMFF scientific advice by REA)

Alternative scenario 2 (changes to programme portfolios of Executive Agencies, with the treatment in the initial scenario in brackets):
 Add to REA mandate: delegation of the new space programme to REA (initial scenario: management of legacy by REA; delegation of the new space programme to EASME)
 Delete from REA mandate: centralised management of the entire H2020 SME instrument in EASME (initial scenario: management of the H2020 SME instrument split across four Agencies (EASME, CHAFEA, INEA (former TEN-T EA) and REA) and the European Commission (H2020 societal challenge – Health, demographic change and well-being))
 Add to REA mandate: delegation of H2020 societal challenge – Food security, sustainable agriculture, marine and maritime research and the bio-economy – to REA (initial scenario: delegation to CHAFEA)
 Delete from REA mandate: management of EMFF scientific advice by EASME (initial scenario: management of EMFF scientific advice by REA)

Source: CBA, SFS.

It was concluded that alternative scenario 2 was the most efficient in terms of potential cost savings and qualitative benefits. It was estimated that, to manage EUR 13 267 million annually (an increase of 127 % compared to 2013), the six Agencies would need 2 887 FTEs in 2020 (an increase of 71 %, i.e. an additional 1 200 FTEs compared to 2013). This compares favourably with the “in-house scenario”, which would require 3 088 FTEs to manage the same programmes. A further conclusion was that the EAs would benefit from economies of scale as they became larger.

In order to achieve further efficiency gains, the Commission proposed a few adjustments stemming from an improved level of productivity and aimed at containing administrative costs through a 5 % staff reduction. This changed the ratio of budget ‘per head’ in the case of REA from EUR 3.07 million to EUR 3.14 million in 2020, making it about 1.5 times higher than in 2013. Excluding central administrative and logistical support services, the estimated ratio of budget ‘per head’ would rise from EUR 2.69 million in 2013 to EUR 4 million in 2020. REA’s mandate related to its central administrative and logistical support services was also significantly expanded. Overall, the Agency was expected to achieve efficiency gains stemming from various sources:

 Simplification measures proposed for the 2014-2020 programmes;

 Continuous innovation and learning striving for organisational excellence;

 Optimising the delivery of some functions.
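As a cross-check of the per-head figures above (our own arithmetic on the Table 9 data, not a calculation taken from the CBA itself), the ratios follow directly from budget managed divided by FTEs:

```python
# Budget 'per head' = budget managed (EUR million) / FTEs.
# Input figures are those reported for REA in Table 9.
budget_2020 = 2401  # EUR million to be managed by REA in 2020

per_head = budget_2020 / 783          # envisioned FTEs -> ~3.07
per_head_adj = budget_2020 / 764      # adjusted FTEs after the 5 % cut -> ~3.14
per_head_2013 = 1171 / 558            # 2013 situation -> ~2.10

print(round(per_head, 2), round(per_head_adj, 2))
print(round(per_head_adj / per_head_2013, 1))  # ~1.5 times the 2013 ratio
```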

Table 9. Budget89 managed and human resources in REA compared to all Executive Agencies in 2013 and 2020.

Executive Agency | Budget managed in 2013 (EUR million) | FTEs in 201390 | Budget per head 2013 (EUR million) | Budget to be managed by EA in 2020 (EUR million) | Envisioned FTEs in 2020 | Envisioned FTEs in 2020 (adjusted) | Budget per head in 2020 (EUR million) | Budget per head in 2020, adjusted (EUR million)
REA | 1 171 | 558 | 2.10 | 2 401 | 783 | 764 | 3.07 | 3.14
REA (without support services) | 1 171 | 436 | 2.69 | 2 401 | 614 | 600 | 3.91 | 4.00
All EAs | 5 846 | 1 687 | 3.47 | 13 26791 | 2 887 | 2 630 | 4.60 | 5.46

Source: Communication to the Commission on the delegation of the management of the 2014-2020 programmes to Executive Agencies (SEC(2013)493), adapted by PPMI; SFS; own analysis.

These adjusted results of the CBA were used in the Specific Financial Statement92 (SFS). With regard to the forecasts for the administrative budget, the SFS differs from the CBA in the following aspects:

 As explained above, the staff number was reduced by 3 % on average under the Executive Agency scenario and by 2 % on average under the in-house scenario;

 The costs in the CBA were calculated in constant 2013 prices (i.e. neutralising the effect of inflation). In the SFS, however, the same constant-price estimations were used while being labelled as estimates in current prices, without any further indexation. In real terms, this constituted a further reduction of the administrative budget, with the impact of the reduction gradually increasing over time;

 The SFS contained a number of inconsistencies in both the Executive Agency and the in-house scenarios (e.g. inconsistent application of the stated average cost assumptions).

Table 10 below summarises the assumptions used in the CBA and SFS for both scenarios (the in-house and the selected Executive Agency scenario). The CBA assumptions were not modified in the SFS (except for constant vs current prices).

Table 10. Assumptions used in the ex ante CBA and SFS.

In-house scenario

CBA assumptions:
Description of the scenario:
 Legacy 2007-2013 programmes managed by REA until 2017; any left-over legacy would then be handed over to the Commission.
 New programmes managed by the Commission.
Staffing mix:
 EC: establishment plan posts – 70 %; external – 30 %.
 REA: TA – 25 %; CA – 75 %.
Average cost assumptions:
 European Commission: establishment plan posts – EUR 108 000; contract staff – EUR 47 000; seconded national experts – EUR 55 000; overheads – EUR 23 000.
 REA: establishment plan posts – EUR 100 243; contract staff (CAs) – EUR 45 737; overheads – EUR 21 281.
 Average cost assumptions are based on DG BUDG estimations.
 The costs related to programme support93 (Title III expenditure) have not been included in the calculations, as these are likely to be the same across all scenarios. As such, they do not affect the cost differential between the different scenarios.

SFS assumptions:
 Description of the scenario: no changes compared to the CBA.
 Staffing mix: no changes compared to the CBA.
 Average cost assumptions: average annual staff costs – no changes compared to the CBA. Overheads – although the Annex to the SFS provides for different average work environment costs (Title II expenditure), the rates actually used are very similar to the ones used in the CBA.
 It should also be noted that the calculation of the administrative appropriations contains a number of inconsistencies and deviations from the stated average cost assumptions.

Executive Agency scenario

CBA assumptions:
 Description of the scenario: new and legacy programmes managed by REA.
 Staffing mix: same as in-house scenario. The number of staff includes 18.8 FTEs in the Commission for the supervision of, and coordination with, REA.
 Average cost assumptions: same as in-house scenario.

SFS assumptions:
 Description of the scenario: same as in the CBA.
 Staffing mix: same as in the CBA, except that the number of staff includes 14.9-13.2 FTEs over 2014-2020 in the Commission for the supervision of, and coordination with, REA.
 Average cost assumptions: average annual staff costs – no changes compared to the CBA. Overheads – although the Annex to the SFS provides for different average work environment costs (Title II expenditure), the rates actually used are very similar to the ones used in the CBA. Cost estimations for Title III expenditure (programme support expenditure) were also added.
 It should be noted that the calculation of the administrative appropriations contains a number of inconsistencies and deviations from the stated average cost assumptions.

Source: CBA, SFS.

89
EU budget only
90
Authorised number of staff
91
Adjusted by the Commission to EUR 14 358 million.
92
Specific Financial Statement related to the Decision Establishing the European Research Council Executive Agency and repealing Decision 2008/37/EC.
93
Such costs include experts, studies, representation and external meeting expenses; missions and related costs; audit expenses; expenses of information, publications and communication; expenses of translation; and operational related IT costs.

REA’s SFS was updated in 2017 following the update of REA’s mandate, which primarily concerned the delegation of classified projects from SC7 to REA and the expansion of centralised services for the registration and validation of all beneficiaries and service providers (SEDIA). Consequently, the retrospective CBA is based on the SFS version of April 2017. It should be noted that:

 The updated SFS contains a number of inconsistencies in the Executive Agency scenario (e.g. inconsistent application of the stated average cost assumptions, similar to the original SFS).

 The budget estimations of the in-house scenario were not updated (although the Commission would require an additional 51.6 FTEs to manage the updated mandate in 2020). Therefore, the in-house scenario’s budget in our analysis (Table 37) was calculated on the basis of the estimated number of staff and the CBA/SFS average cost assumptions.

4.2.2 Actual staffing and costs of REA

During 2015-2018, the actual94 administrative budget of REA (see Table 11) constituted
EUR 250.375 million. The SFS estimations (EUR 255.016 million) were based on the EU
contribution only while REA’s administrative budget also included EFTA/EEA and third
country contributions (EUR 11.510 million during 2015-2018) to manage additional
operational budget. Based on the EU contribution only, the actual administrative budget
of REA constituted EUR 238.866 million, 6 % lower than estimated in the SFS, with
savings of EUR 16.15 million.
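The stated saving can be verified directly from the SFS and actual totals (our own arithmetic on the figures quoted above, not a calculation from the report's annexes):

```python
# Savings on the 2015-2018 administrative budget when only the EU
# contribution is counted, EUR million.
sfs_estimate = 255.016   # SFS estimation, 2015-2018 total
actual_eu = 238.866      # actual budget based on the EU contribution only

savings = sfs_estimate - actual_eu
shortfall = 1 - actual_eu / sfs_estimate
print(f"EUR {savings:.2f} million saved, {shortfall:.1%} below the SFS estimate")
```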

The costs in the 2013 CBA were calculated in constant 2013 prices (i.e. neutralising the effect of inflation); in the SFS, however, these estimations were used as current prices without any further adjustment for inflation. In order to analyse this, the table below also presents a comparison of the planned and actual REA budget, where the SFS estimations were adjusted to reflect current prices. Current prices were established using a fixed 2 % annual deflator95.

94
Executed commitment appropriations for 2015-2017, budgeted expenditure for 2018.
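The current-price adjustment can be sketched as follows (our own reproduction, assuming the 2 % deflator compounds from the 2013 base year of the CBA's constant prices):

```python
# Adjusted SFS estimate in current prices = constant-price estimate * 1.02 ** (year - 2013).
# Constant-price SFS totals per year, EUR million (from Table 11).
sfs_constant = {2015: 60.471, 2016: 63.515, 2017: 64.087, 2018: 66.943}

sfs_current = {year: value * 1.02 ** (year - 2013)
               for year, value in sfs_constant.items()}
for year, value in sfs_current.items():
    print(year, round(value, 3))
# Reproduces the adjusted SFS row of Table 11: 62.914, 67.403, 69.370, 73.910.
```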

Table 11. Actual and estimated REA administrative budget, EUR million.

Administrative budget | 2015 | 2016 | 2017 | 2018 | Total 2015-2018

Title I. Staff Related Expenditure
SFS | 38.113 | 39.951 | 40.520 | 42.875 | 161.459
Actual | 37.584 | 42.585 | 48.241 | 52.213 | 180.623
EEA and 3rd country contributions | 1.167 | 1.166 | 3.080 | 2.940 | 8.353
Actual (EU contribution) | 36.418 | 41.419 | 45.161 | 49.273 | 172.271

Title II. Infrastructure and Operating Expenditure
SFS | 13.924 | 15.063 | 15.710 | 16.915 | 61.612
Actual | 8.801 | 9.156 | 9.352 | 9.920 | 37.229
EEA and 3rd country contributions | 0.281 | 0.251 | 0.597 | 0.550 | 1.679
Actual (EU contribution) | 8.519 | 8.905 | 8.755 | 9.370 | 35.550

Title III. Programme Support Expenditure
SFS | 8.434 | 8.501 | 7.857 | 7.153 | 31.945
Actual | 7.803 | 7.998 | 9.279 | 7.443 | 32.523
EEA and 3rd country contributions | 0.245 | 0.220 | 0.593 | 0.420 | 1.478
Actual (EU contribution) | 7.559 | 7.778 | 8.686 | 7.023 | 31.045

Total
SFS | 60.471 | 63.515 | 64.087 | 66.943 | 255.016
Actual | 54.188 | 59.739 | 66.872 | 69.576 | 250.375
EEA and 3rd country contributions | 1.693 | 1.637 | 4.270 | 3.910 | 11.510
Actual (EU contribution) | 52.496 | 58.102 | 62.602 | 65.666 | 238.866

Savings (SFS - Actual) | 6.283 | 3.776 | -2.785 | -2.633 | 4.641
Savings (SFS - Actual EU contribution) | 7.975 | 5.413 | 1.485 | 1.277 | 16.150
SFS adjusted for current prices using 2 % annual deflator | 62.914 | 67.403 | 69.370 | 73.910 | 273.597
Savings (SFS adjusted for current prices - Actual) | 8.726 | 7.663 | 2.498 | 4.335 | 23.222
Savings (SFS adjusted for current prices - Actual EU contribution) | 10.418 | 9.301 | 6.768 | 8.245 | 34.731

Source: SFS, annual financial reports, AWPs, own analysis.

95
A 2 % annual deflator is provided for in Article 6(2) of the MFF Regulation.

Staff-related expenditure in 2015-2018 was 12 % (7 % based on the EU contribution) higher than the initial SFS estimates, with the excess increasing over the period 2016-2018. While in 2015 the actual staff-related expenditure based on the EU contribution was 4 % below the SFS estimations, during 2016-2018 it surpassed the SFS estimations by 4 %, 11 % and 15 % respectively.

There were a few reasons for this. Firstly, the actual average staff costs in 2015-2018 were higher than estimated in the SFS: the average actual staff-related costs96 amounted to EUR 110 538 for TAs (versus EUR 100 243 in the SFS estimations) and EUR 50 937 for CAs (versus EUR 45 737 in the SFS estimations). The average staff-related costs grew over the evaluation period, which, combined with the decreasing vacancy rate at the end of 2017, led to an increasing excess in Title I expenditure over the period 2017-2018. The CA profile also changed, with a decreasing share of lower-grade (FG I and FG II) CAs and an increasing share of higher-grade (FG III and FG IV) CAs97, which contributed to the growth of the average staff costs of CAs. This shift to higher function groups was required as a result of a greater reliance on IT automation, allowing REA to generate efficiency gains while concentrating its resources on more content-related tasks in the proposal evaluation and project-monitoring processes. Moreover, the share of higher function group CAs (FG III and FG IV) in REA still remained higher in 2018 than the average across all Executive Agencies. Secondly, as already indicated, the calculation of the administrative appropriations in the SFS contained a number of inconsistencies and deviations from the stated average cost assumptions. If Title I expenditure were calculated by multiplying the number of staff in REA98 by the stated average rates (annual salary costs)99, the Title I budget100 would be lower in 2015 and 2016 (by 1.87 % and 0.51 % respectively) and higher in 2017 and 2018 (by 0.19 % and 2.66 % respectively) compared with the Title I budget estimations provided for in the SFS.

96
Including professional development and social expenditure.
97
The same trend of increasing CA grades was observed across all Executive Agencies and is possibly related to the changing working environment of the Agencies (automation of proposal and grant management processes (paperless workflows), etc.).
98
SFS section 2.2.3 a) Estimated human resources need (in full time equivalents) (Executive Agency scenario).
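These percentage deviations can be reproduced from the SFS Title I figures and the recomputed values given in footnote 100 (a short check of our own, not part of the SFS methodology):

```python
# SFS Title I budget vs. Title I recomputed from staff numbers x stated
# average annual salary rates (footnote 100), EUR million.
sfs_title1 = {2015: 38.113, 2016: 39.951, 2017: 40.520, 2018: 42.875}
recomputed = {2015: 37.399, 2016: 39.746, 2017: 40.598, 2018: 44.015}

for year in sfs_title1:
    deviation = (recomputed[year] / sfs_title1[year] - 1) * 100
    print(year, f"{deviation:+.2f} %")
# -1.87 %, -0.51 %, +0.19 %, +2.66 %, matching the figures quoted in the text.
```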

The actual number of staff101 financed from the EU contribution was lower than the SFS estimations (see Table 12), while the composition of staff (the ratio between TAs and CAs) corresponded to the SFS estimations. REA also employed additional CAs financed from EEA/EFTA and 3rd country contributions.

Table 12. Actual and estimated number of REA staff.

 | 2015 | 2016 | 2017 | 2018
Planned No of REA staff according to the SFS financed from the "EU general budget" | 630 | 670 | 684 | 745
Additional staff (CAs) financed from the 3rd country contributions | 19 | 23 | 27 | 31
Total No of planned REA staff | 649 | 693 | 711 | 776
Actual No of REA staff | 618 | 628 | 693 | 736
of which TAs | 154 | 146 | 163 | 175
of which CAs | 464 | 482 | 530 | 561

Source: SFS, annual financial reports, AWPs, REA’s administrative data provided for the CBA in 2018.

Infrastructure and operating expenditure was significantly lower than forecast in the SFS. Actual infrastructure and operating expenditure (EU contribution) over 2015-2018 amounted to 58 % of the SFS estimations, which helped to counterbalance the excess in Title I expenditure.
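The 58 % execution figure follows from the Title II totals in Table 11 (our own arithmetic):

```python
# Execution of Title II (infrastructure and operating) expenditure against
# the SFS estimate, 2015-2018 totals from Table 11, EUR million.
sfs_title2 = 61.612          # SFS estimation, 2015-2018
actual_eu_title2 = 35.550    # actual, EU contribution only, 2015-2018

print(f"Actual/budgeted Title II: {actual_eu_title2 / sfs_title2:.0%}")  # ~58 %
```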

The actual programme support expenditure stayed very close to the SFS estimations
in 2015-2018.

4.2.3 Cost-effectiveness of the Executive Agency scenario and actual savings due to
externalisation

To assess whether the conclusions of the ex ante assessment (estimations on savings


provided in the CBA and SFS) are still valid when compared to the actual situation and
what the overall possible savings are, we:

1. Draw upon the actual performance of REA (actual execution of the administrative budget, actual staffing, etc.);

2. Ensure comparability and validity of results by following the assumptions laid down in the CBA and SFS, and provide estimations of a comparable ‘actual’ in-house (Commission) scenario (comparator) that best reflects the actual situation;

3. Based on these estimations, assess whether the conclusions of the CBA and the SFS ex ante assessment are still valid when compared to the actual situation, and what the overall savings are.

99
SFS Annex: Methodology for calculating staff needs and administrative cost to be expected.
100
The Title I budget estimation calculated by multiplying the number of staff in REA by the stated average rates (annual salary costs) would constitute EUR 37.399 million in 2015, EUR 39.746 million in 2016, EUR 40.598 million in 2017 and EUR 44.015 million in 2018.
101
Actual number of staff at the end of 2015, 2016 and 2017, as well as the estimated number of staff for 2018.

To construct the “actual” in-house (Commission) scenario (“comparator”), we base our estimations on the following CBA/SFS assumptions:

 Number and composition of staff in the Commission and REA under the in-house scenario correspond to the SFS estimations. In addition, the CAs financed from the EEA/EFTA and third countries’ contributions (19 in 2015, 23 in 2016, 27 in 2017 and 31 in 2018) were added to the estimated Commission staff numbers, since these additional authorised REA staff, who managed the additional operational budget, were not covered in the CBA/SFS resource calculations; this ensures comparability of the “in-house” and the “Executive Agency” scenarios.

 The Commission staff costs and overheads correspond to DG BUDG estimations used
for the respective year (Table 13).

 REA’s average staff costs and overheads under the in-house scenario (for legacy)
correspond to the actual average REA staff costs 102 and overheads in the respective
year.

 The Programme support expenditure (Title III) stays the same under the Commission
scenario and the Executive Agency scenario.

Table 13. Estimated “actual” Commission staff costs and overheads, EUR.

2015 2016 2017 2018

Average staff costs: TA 109 000 111 000 115 000 119 000

Average staff costs: CA 47 000 47 000 47 000 50 000

Overheads 23 000 23 000 23 000 24 000

Source: data provided by DG BUDG.
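To illustrate how these assumptions combine, the yearly staff-related cost of the in-house comparator can be sketched as staff numbers multiplied by the DG BUDG average rates, plus a flat overhead per head. The sketch below is illustrative only; the TA/CA split used in the example is hypothetical and does not reproduce the figures of this report.

```python
def in_house_staff_cost(n_ta, n_ca, avg_ta, avg_ca, overhead):
    """Yearly staff cost of the in-house (Commission) scenario:
    salaries per staff category plus a flat overhead per head."""
    n_total = n_ta + n_ca
    return n_ta * avg_ta + n_ca * avg_ca + n_total * overhead

# 2015 DG BUDG rates from Table 13; the 150/480 TA/CA split is hypothetical.
cost_2015 = in_house_staff_cost(150, 480, 109_000, 47_000, 23_000)
print(f"EUR {cost_2015:,}")
```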

The detailed results of the analysis of the estimated and actual costs of the in-house
(Commission) and the Executive Agency scenarios are presented in Annex 7 of this
report.

Figure 33 below provides a summary of the planned (as indicated in the SFS) and the
actual total costs under the in-house and Executive Agency scenarios in 2015-2018.

102 Including professional development and social expenditure.
[Figure 33 data, EUR] Costs of the in-house scenario: 344,651,804 (SFS estimations) / 362,920,000 (actual); costs of the Executive Agency (EA) scenario: 264,750,000 / 257,820,358; savings of the EA scenario: 79,901,804 / 105,099,643.
Figure 33. Costs and savings of the Executive Agency scenario in 2015-2018, EUR.
Note: Costs of the Executive Agency scenario include costs of coordination and monitoring by the Commission. Source: SFS,
own analysis.
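The savings shown in Figure 33 are simply the difference between the total costs of the two scenarios. A quick arithmetic check, using the values from the figure (the actual savings figure is rounded):

```python
# Totals for 2015-2018 as shown in Figure 33 (EUR);
# each pair is (SFS estimations, actual).
in_house_sfs, in_house_actual = 344_651_804, 362_920_000
ea_sfs, ea_actual = 264_750_000, 257_820_358

# Savings = cost of the in-house scenario minus cost of the EA scenario.
savings_sfs = in_house_sfs - ea_sfs          # EUR 79,901,804
savings_actual = in_house_actual - ea_actual # roughly EUR 105.1 million
print(savings_sfs, savings_actual)
```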

The results of the retrospective CBA show that:

 The overall actual costs of the Executive Agency scenario103 amounted to EUR 257.8 million in 2015-2018. To evaluate the extent to which the actual costs corresponded to the initial SFS estimates, it is important to follow the same assumptions that underpinned those estimates. The SFS estimates (EUR 264.8 million over 2015-2018) were based on the EU contribution; however, REA’s administrative budget also included EFTA/EEA and third country contributions (EUR 11.5 million over 2015-2018) to manage additional operational budget. Consequently, based on the EU contribution only, the actual costs of the Executive Agency scenario amounted to EUR 246.3 million, which means that the actual savings compared to the initial estimates for the Executive Agency scenario amounted to EUR 18.4 million, or 7 % of the SFS estimates. Significant cost savings occurred in REA’s Title II “Infrastructure and operating expenditure”. The costs in Title I “Staff related expenditure” were higher than estimated in the SFS, owing to higher average staff costs. Higher staff expenditure may become an even more important issue in subsequent years, since the average staff cost estimates remain constant in the SFS for the 2014-2020 period, while the actual average staff costs will rise further due to salary indexation, promotions and increasing staff seniority.

 The costs of the Executive Agency scenario were much lower than the estimated costs
of the in-house scenario. In 2015-2018 the actual cost savings deriving from
the difference in the cost of the Executive Agency scenario and the in-house
scenario constituted EUR 105 million (or 29 % of the estimated costs under the
in-house scenario).

 Comparing the savings initially estimated in the SFS with the actual savings from the
delegation of tasks to REA, we found that the actual savings during the 2015-2018
period were higher than the initial estimates (EUR 105 million compared to EUR 80
million under the SFS estimates). As forecasted in the SFS and the ex ante CBA,
savings of the Executive Agency scenario primarily resulted from a higher share of
lower cost contract staff (CAs) employed within the Agency and lower overall number
of staff.

103 Including cost of coordination and monitoring by the Commission and costs of REA covered from EEA/EFTA and third country contributions.
4.2.4 Workload analysis

REA’s actual operational budget104 during 2014-2018 constituted EUR 9.109 billion, of
which EUR 602 million related to EEA/EFTA and third countries' contributions, which were
not considered in REA’s SFS. Without these EEA/EFTA and third countries' contributions
REA’s operational budget in 2014-2018 constituted EUR 8.508 billion and was 10 % lower
compared to the initial SFS estimates (EUR 9.495 billion) (please see Table 14 and Figure
34 below). For the actual operational budget of REA by funding source, refer to Annex 7.

Table 14. SFS estimated and REA’s actual operational budget 2014-2018105, EUR million

Planned in the SFS/CBA | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018

FET OPEN 80.00 80.00 100.00 140.00 160.00 560.00

MSCA 701.18 743.04 790.26 838.64 889.18 3 962.29

ITN 356.92 378.23 402.27 426.89 452.62 2 016.93

IF 200.28 212.24 225.73 239.55 253.98 1 131.78

RISE 53.33 56.52 60.11 63.79 67.63 301.38

COFUND 90.64 96.05 102.16 108.41 114.94 512.21

NIGHT 0.00

SPACE 98.65 101.14 107.78 117.99 125.09 550.64

SC2 Agri 307.82 317.30 338.03 368.17 390.36 1 721.68

SC6 Societies 145.33 146.83 154.54 166.12 175.62 788.45

SC7 Security 131.22 135.92 161.18 165.24 187.14 780.69

SEWP 88.64 96.57 103.86 110.94 117.95 517.97

SWaFS 50.17 54.66 58.79 62.80 66.77 293.19

Experts 55.32 59.40 64.82 68.40 72.61 320.55

Total 1 658.34 1 734.85 1 879.26 2 038.29 2 184.71 9 495.45

Actual

FET OPEN 80.00 81.61 91.45 114.09 189.86 557.00

104 In commitment appropriations.
105 In commitment appropriations.
MSCA 841.64 811.03 829.81 900.65 986.82 4 369.94

ITN 440.17 426.39 437.69 482.33 540.84 2 327.41

IF 243.52 220.80 224.11 256.16 273.01 1 217.60

RISE 70.00 80.00 80.00 80.06 80.00 390.06

COFUND 80.00 83.84 80.01 82.10 80.94 406.89

NIGHT 7.95 0.00 8.00 0.00 12.03 27.98

SPACE 103.69 75.98 102.31 96.76 104.60 483.35

SC2 Agri 233.39 151.09 298.18 391.25 396.31 1 470.22

SC6 Societies 99.62 97.96 80.64 110.66 119.98 508.86

SC7 Security 147.12 127.59 88.54 143.79 219.83 726.86

SEWP 47.82 66.71 94.00 137.62 138.00 484.14

SWaFS 37.70 56.33 41.02 58.42 61.25 254.72

Experts 47.35 51.72 47.55 53.70 54.08 254.40

Total 1 638.31 1 520.00 1 673.51 2 006.94 2 270.73 9 109.50

Actual EEA/EFTA and third countries’ contributions 65.95 127.83 102.07 138.94 167.08 601.87

Actual EU contributions 1 572.37 1 392.17 1 571.44 1 868.00 2 103.66 8 507.63

Actual EU contributions / Planned 95 % 80 % 84 % 92 % 96 % 90 %

Source: SFS, data provided by REA.

[Figure 34 data, EUR billion, cumulative] SFS estimations: 1.658 (2014), 3.393 (2015), 5.272 (2016), 7.311 (2017), 9.495 (2018); Actual (all funding sources): 1.638, 3.158, 4.832, 6.839, 9.109; Actual (EU contribution): 1.572, 2.965, 4.536, 6.404, 8.508.

Figure 34. SFS estimated and REA’s actual cumulative operational budget 2014-2018106, EUR million.
Source: SFS, data provided by REA.

A more in-depth analysis showed that REA’s actual operational budget based on the EU
contributions was close to the SFS estimations for MSCA (102 % of the initial SFS
estimations) and FET OPEN (96 %) and was lower for SEWP (91 %), SC7 Security
(88 %), Space (82 %), SwafS (82 %), SC2 Agri (80 %) and SC6 Societies (62 %) (see
Table 15).

Table 15. SFS estimated and REA’s actual operational budget 2014-2018 by
programme107, EUR million.

SFS | Actual | Actual EU contributions | Actual EU contributions compared to SFS

FET OPEN 560.00 557.00 536.31 96 %

MSCA 3 962.29 4 369.94 4 041.49 102 %

ITN 2 016.93 2 327.41 2 045.82 101 %

IF 1 131.78 1 217.60 1 177.63 104 %

RISE 301.38 390.06 390.06 129 %

COFUND 512.21 406.89 400.00 78 %

NIGHT 0.00 27.98 27.98

SPACE 550.64 483.35 453.92 82 %

SC2 Agri 1 721.68 1 470.22 1 381.09 80 %

106 In commitment appropriations.
107 In commitment appropriations.
SC6 Societies 788.45 508.86 487.59 62 %

SC7 Security 780.69 726.86 683.84 88 %

SEWP 517.97 484.14 471.99 91 %

SWaFS 293.19 254.72 239.33 82 %

Experts 320.55 254.40 212.06 66 %

Total 9 495.45 9 109.50 8 507.63 90 %

Source: SFS, data provided by REA, own analysis.

The Agency’s workload is closely linked with the allocated operational budget, number of
proposals received and number of grants concluded. In order to estimate the workload,
the simulator developed by REA was used in the 2013 CBA. The workload of the Agency
was estimated on the basis of the number of grants managed (stock of running projects)
and productivity indicators (number of person days required to manage a typical project
life-cycle).

The number of grants depends on the operational budget and the average grant size.
The average grant size is a variable that is influenced both by (1) the parameters fixed
in the design of the calls (such as the maximum budget of the project, maximum
duration of the project, requirements for partnerships, standard costs (flat rates, unit
costs, lump sums) used for the respective projects and the indicative size of an average
project) and (2) the individual characteristics of the selected projects. The average grant
size assumption implies a risk of underestimating/overestimating the actual workload of
the Agency. If the average grant size turns out to be lower than expected, this would
lead to a higher number of grants per allocated budget. Consequently, the workload and
the corresponding FTE requirements would be higher than expected.
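The relationship described above can be sketched as a one-line estimator: the expected number of grants is the operational budget divided by the assumed average grant size. The FET Open figures from Tables 14, 16 and 17 illustrate it; this is a consistency check on the report's figures, not REA's actual workload simulator.

```python
def expected_grants(budget_meur, avg_grant_meur):
    """Expected number of grants for a given operational budget
    and assumed average grant size (both in EUR million)."""
    return round(budget_meur / avg_grant_meur)

# FET Open, 2014-2018: planned EUR 560.00m at EUR 2.00m per grant,
# actual EUR 557.00m at an actual average of EUR 2.41m per grant.
print(expected_grants(560.00, 2.00), expected_grants(557.00, 2.41))
```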

Our analysis shows that the average grant size was overestimated for most of the H2020 programmes and actions managed by REA108, except for MSCA RISE and, to a lesser extent, FET Open (see Table 16). A more detailed analysis also showed a tendency towards an increasing average grant size in SC2, SC6 and SC7 and a decreasing average grant size in FET-Open during 2014-2018, which is bringing the average grant size closer to the initial CBA estimations.

Table 16. Average grant size in 2014-2018 by programme, EUR million.

SFS | Actual | Actual EU contributions compared to planned in the CBA

FET OPEN 2.00 2.41 121 %

MSCA

ITN 3.40 3.31 97 %

IF 0.25 0.18 72 %

108 With the exception of MSCA RISE and FET-OPEN.
RISE 0.40 0.85 212 %

COFUND 3.10 2.95 95 %

NIGHT 0.19

SPACE 5.40 2.54 47 %

SC2 Agri 6.80 5.29 78 %

SC6 Societies 3.20 3.07 96 %

SC7 Security 5.90 4.33 73 %

SEWP 3.20 1.85 58 %

SWaFS 3.20 2.11 66 %

Source: CBA, data provided by REA, own analysis.

Considering the actual operational budget dedicated to the respective programmes (refer to Table 17) and the actual average grant size, the actual number of new grants managed by REA during 2014-2018 was 32 % higher than the 2013 CBA forecast.

Table 17. CBA estimated and actual number of new grants across different programmes
and actions managed by REA under 2014-2018 calls.

CBA estimations | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018

FET OPEN 40 40 50 70 80 280

MSCA 1 069 1 132 1 204 1 278 1 355 6 039

ITN 105 111 118 126 133 593

IF 801 849 903 958 1 016 4 527

RISE 133 141 150 159 169 753

COFUND 29 31 33 35 37 165

SPACE 18 19 20 22 23 102

SC2 Agri 45 47 50 54 57 253

SC6 Societies 45 46 48 52 55 246

SC7 Security 22 23 27 28 32 132

SEWP 28 30 32 35 37 162

SWaFS 16 17 18 20 21 92

Total 1 283 1 354 1 451 1 558 1 660 7 306

Actual

FET OPEN 26 27 42 71 65 231

MSCA 1 648 1 495 1 539 1 663 1 887 8 232

ITN 138 131 133 149 153 704

IF 1 347 1 243 1 239 1 404 1 551 6 784

RISE 90 91 91 89 100 461

COFUND 26 30 33 21 28 138

NIGHT 47 0 43 0 55 145

SPACE 42 34 32 33 49 190

SC2 Agri 53 30 57 70 68 278

SC6 Societies 38 36 23 34 35 166

SC7 Security 39 30 29 34 36 168

SEWP 45 67 11 73 66 262

SWaFS 20 23 22 24 32 121

Total 1 911 1 742 1 755 2 002 2 236 9 646

Actual/Planned 149 % 129 % 121 % 128 % 135 % 132 %

Source: CBA, data provided by REA.

However, the increase in the number of grants managed by REA mostly relates to the smallest grants – MSCA Individual Fellowships – which are characterised by the lowest workload level per grant (number of person days required to manage a typical project life-cycle).

Table 18. CBA estimated and actual number of new grants across different programmes
and actions managed by REA under 2014-2018 calls.

CBA | Actual | Actual compared to planned | Actual attributable to the EU contribution109 | Actual attributable to the EU contribution compared to planned

FET OPEN 280 231 83 % 222 79 %

MSCA 6 039 8 232 136 % 7 922 131 %

ITN 593 704 119 % 619 104 %

IF 4 527 6 784 150 % 6 561 145 %

RISE 753 461 61 % 461 61 %

COFUND 165 138 84 % 136 82 %

NIGHT 0 145 145

SPACE 102 190 186 % 178 175 %

SC2 Agri 253 278 110 % 261 103 %

SC6 Societies 246 166 67 % 159 65 %

SC7 Security 132 168 127 % 158 119 %

SEWP 162 262 162 % 255 158 %

SWaFS 92 121 132 % 114 124 %

Total 7 306 9 648 132 % 9 270 127 %

Total excluding MSCA 1 267 1 416 112 % 1 348 106 %

Source: CBA, data provided by REA, own analysis.
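The attribution rule described in footnote 109 is a simple proportional split of each programme's grant count by the EU share of its budget. The sketch below reproduces two rows of the table above (FET Open and MSCA ITN) as a consistency check:

```python
def grants_attributable_to_eu(n_grants, eu_contribution, total_budget):
    """Attribute grants to the EU contribution proportionally to its
    share of the overall budget of the programme/grant scheme."""
    return round(n_grants * eu_contribution / total_budget)

# FET Open 2014-2018: 231 grants, EUR 536.31m EU contribution
# out of a EUR 557.00m total budget.
print(grants_attributable_to_eu(231, 536.31, 557.00))

# MSCA ITN 2014-2018: 704 grants, EUR 2 045.82m of EUR 2 327.41m.
print(grants_attributable_to_eu(704, 2045.82, 2327.41))
```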

The table below presents data on the CBA estimated110 and the actual workload parameters related to the central support services provided by REA over 2014-2017111. The actual number of experts supported on site was lower than initially estimated; however, this could be attributed to the increased use of remote evaluations over the evaluation period. The number of contracted experts was also lower than the initial CBA estimations, whereas the number of validations112 and financial viability
109 Number of grants attributable to the EU contribution under the programme/grant scheme was calculated proportionally to the ratio between the EU contribution and the overall budget allocated to the respective programme/grant scheme.
110 In addition to 2013 CBA estimates, the figures also include additional workload estimated under the 2015 Cost–Benefit Analysis for the delegation of the validation of legal entities and the preparation of legal entities viability assessment to REA.
111 2014 data is included in analysis to provide a more comprehensive view.
112 Validation of legal status of entities (FEL) and extended mandate validation (LEAR).
checks/financial capacity assessments was higher than the initial CBA estimations.

Table 19. CBA estimated and actual workload parameters related to the central support
services provided by REA.

2014 | 2015 | 2016 | 2017 | Total 2014-2017

CBA estimations

Expert management

No. of experts hosted 12 695 13 750 14 744 15 709 56 898

No. of Appointment Letters (expert contracts) 17 078 18 498 19 835 21 133 76 544

Research Enquiry Service

No. of RES enquiries received: No estimation

Validation services

No. of legal status of entities (FEL) and LEAR extended mandate validations 8 875 9 373 10 515 10 418 39 181

No. of Financial viability checks (FVC) 1 411 1 529 1 639 1 747 6 326

Actual

Expert management

No. of experts hosted on site 8 400 8 800 6 730 7 063 30 993

No. of experts working remotely 11 800 No data*

No. of expert contracts 11 400 11 000 13 100 15 700 51 200

Research Enquiry Service

No. of requests received by RES 13 000 10 700 10 790 7 900 42 390

Validation services

No. of legal status of entities (FEL) validations 5 923 7 400 7 320 7 800 28 443

No. of LEAR extended mandate validations 11 390 10 450 7 700 8 900 38 440

No. of Financial viability checks (FVC)113 1 600 2 000 1 850 1 316 6 766

Note: The data on the external experts supervised remotely was not available.
Source: CBA, REA’s AARs, own analysis.

In 2017, REA was delegated a major role in the development and roll-out of SEDIA.
Though the volume of transactions did not reach the CBA estimates in the first semester
of 2018 as mentioned in section 4.1.1., REA carried out important work in facilitating the
on-boarding of the services in preparation for a significantly higher workload in relation
to these activities during the second semester of 2018 114. The interview programme also
revealed that the workload parameters related to the SEDIA project were based on
rather general assumptions and it was recognised that the actual workload parameters
could significantly deviate from the initial estimates. Taking into account that the
evaluation period ends with the first semester of 2018, which was only the initial period
of roll-out of SEDIA, conclusions on the SEDIA-related workload would be premature and
should be postponed to a later stage. For more details on the implementation of SEDIA,
refer to Annex 5.

Overall, the actual workload115 of REA was higher than estimated in the 2013
CBA. In addition, the actual workload significantly deviated from the initial CBA
estimates across different programmes.

Many parameters influencing REA’s workload level (such as the operational budget allocated to relevant H2020 programmes and actions, the average grant size and the corresponding number of grants) were beyond the influence of REA. Information collected during the evaluation showed that REA performed close and regular monitoring of the actual workload level (monitoring of parameters of the programmes and grants, updates of the productivity indicators and workload simulator, etc.); the results of this monitoring were used to allocate human resources across the Agency.

4.2.5 Qualitative aspects of the CBA

The qualitative aspects of the CBA indicated in the ToR (which reflect the CBA questions
provided in Article 3(1) of Regulation (EC) No 58/2003) were integrated into the overall
evaluation framework, i.e. they are presented in detail in sections 3, 2.1, 2.2 and the
quantitative retrospective CBA of the report, as shown in Table 20 below. This approach avoided duplication of work and ensured an integrated approach throughout the
evaluation exercise. Nevertheless, in this part of the report we provide a short summary
of the key findings concerning each qualitative and quantitative aspect of the CBA.

113 Financial capacity assessments (FCA) from 2017.
114 REA 2018 Interim Report, p. 37.
115 Based on the number of new grants, which was the main workload estimation parameter in the 2013 CBA.
Table 20. Qualitative and quantitative aspects of the CBA and their correspondence to
the evaluation sections.

Qualitative and quantitative aspects of the CBA | Correspondence to evaluation sections

Identification of the tasks justifying outsourcing | Section 3 – REA’s Regulatory Framework, Mission and Governance

The costs of coordination and checks | Section 4.1.2 – efficiency and Section 4.2 (retrospective CBA)

The impact on human resources | Section 4.1.2 – efficiency and Section 4.2 (retrospective CBA)

Possible savings within the general budgetary framework of the European Union | Section 4.1.2 – efficiency and Section 4.2 (retrospective CBA)

Efficiency and flexibility in the implementation of outsourced tasks | Section 4.1.1 – effectiveness and Section 4.1.2 – efficiency

Simplification of the procedures used | Section 4.1.2 – efficiency

Proximity of outsourced activities to final beneficiaries | Section 4.1.1 – effectiveness

Visibility of the European Union as promoter of the European Union programme concerned | Section 4.1.1 – effectiveness

The need to maintain an adequate level of know-how inside the Commission | Section 4.1.3 – coherence

Source: compiled by PPMI.

Identification of the tasks justifying outsourcing

Management tasks outsourced by the Commission to the Agency are clearly indicated in
the Commission Decision (Decision C(2014) 9450) of 12 December 2014 and the
Decision (Decision C(2017) 4900) of 14 July 2017 when REA’s mandate was adjusted
amending Decision 2014. The management tasks are justified in the preambles of these
documents as well as in the Delegation Act and other related documents. Outsourcing of
management tasks was within the limits of REA’s legal framework, i.e. the Commission
effectively retained its discretionary powers in translating political choices into action. For
more detail on this aspect, please refer to the sections 3.1, 4.1.1 and 4.1.3 of this report.

The costs of coordination and checks

The overall actual costs of the Executive Agency scenario116 amounted to EUR 257.8 million in 2015-2018. To evaluate the extent to which the actual costs corresponded to the initial SFS estimates, it is important to follow the same assumptions that underpinned those estimates. The SFS estimates (EUR 264.8 million over 2015-2018) were based on the EU contribution; however, REA’s administrative budget also included EFTA/EEA and third country contributions (EUR 11.5 million over 2015-2018) to manage additional operational budget. Consequently, based on the EU contribution only, the actual costs of the Executive Agency scenario amounted to EUR 246.3 million, which means that the actual savings compared to the initial estimates for the Executive Agency scenario amounted to EUR 18.4 million, or 7 % of the SFS estimates.

116 Including cost of coordination and monitoring by the Commission and costs of REA covered from EEA/EFTA and third country contributions.
The impact on human resources

Retrospective CBA revealed that one of the key challenges for the Agency during 2015-
2018 related to the fact that the actual workload of REA was higher than estimated in the
2013 CBA. As explained in section 4.2.4 many factors, which were beyond the Agency’s
control, influenced its workload, such as the average grant size and the number of
proposals and grants. REA also took a series of actions to cope with the varying levels of
workload across its different units. The Agency re-balanced (i.e. without any increase in
overall staff) its staffing levels across all of its units and the appropriate ratio of
administrative and operational staff with the adoption of various measures. For instance,
REA introduced a workload measurement exercise, which facilitated reallocations of staff
between different units and adopted a proactive approach towards the recruitment
process with the active participation of operational units.

The staff satisfaction survey carried out in 2016 revealed that REA’s key strengths were
related to overall job satisfaction and the staff satisfaction with the Agency as a modern
and attractive workplace. In addition, REA achieved positive results with respect to the
willingness of its staff to give extra effort when required, the level of understanding of
the Agency purpose as well as job clarity. REA was also effective in filling vacancies and
maintaining a relatively low vacancy rate (2 %) between 2015 and 2018. At the same
time, REA demonstrated less favourable results in terms of career development opportunities and mobility, work recognition, equal opportunities and internal communication. Although staff satisfaction with these aspects was the least favourable among the HR indicators, some of the results were still equal to or somewhat higher than the corresponding levels of agreement for the Commission and other Executive Agencies.

For more detail on the impact of externalisation on human resources please refer to
sections 3.1 and 4.1 of this report.

Possible savings within the general budgetary framework of the European Union

The costs of the Executive Agency scenario were much lower than the estimated costs of
the in-house scenario. In 2015-2018 the actual cost savings deriving from the difference
in the cost of the Executive Agency scenario and the in-house scenario constituted
EUR 105 million (or 29 % of the estimated costs under the in-house scenario).

Comparing the savings initially estimated in the SFS with the actual savings from the
delegation of tasks to REA, we found that the actual savings during 2015-2018 period
were higher than the initial estimates (EUR 105 million compared to EUR 80 million under
the SFS estimates). As forecasted in the SFS and the ex ante CBA, savings of the
Executive Agency scenario primarily resulted from a higher share of lower cost contract
staff employed within the Executive Agency and lower overall number of staff.

Efficiency and flexibility in the implementation of outsourced tasks

Overall, REA was efficient in managing the delegated programmes and achieved very
good results in terms of most KPIs. Compared to the previous years, the Agency’s
performance further improved during the 2015-2018 period. This was accompanied by an
increasing level of satisfaction among REA’s beneficiaries with the performance of the
Agency during all stages of the project life-cycle.

Simplification of the procedures used

During 2015-2018, REA, in cooperation with the Commission, continued the optimisation
of its procedures and programme management functions and introduced a number of
simplifications. The improvements and simplifications concerned wider use of remote
evaluation of proposals, remodelling of the evaluation building, improved procedures for
allocating proposals to the most suitable experts, measures to automate and improve the
detection of possible conflict of interest for experts, electronic workflows and wider use of
IT tools, etc. REA’s Networks played an important role in the optimisation and
simplification process. Some of the simplifications introduced by REA could be further
streamlined across other programmes and Executive Agencies.

Proximity of outsourced activities to final beneficiaries

During the evaluation period, REA adopted new measures to boost awareness of funding opportunities and opportunities for experts, to broaden the group of participants, to consolidate service-oriented communication, and to support parent DGs in giving visibility to EU research via success stories and input to the policy-feedback loop. We found that REA’s beneficiaries were largely satisfied with the way the Agency
communicated with them during the project/contract implementation phase. For
instance, 74 % of respondents to the beneficiary survey were satisfied with the
responsiveness and competence of staff. Around 75 % of survey respondents claimed
that the feedback received on the progress with the content of their project or contract
was useful. For more information on this, please refer to section 4.1 of this report.

Visibility of the European Union as promoter of the European Union programme concerned

A large majority of beneficiaries related the grants and tenders managed by REA both to
the EU budget (95 %) and to the European Commission as an institution (84 %). For
more information on this, please refer to section 4.1 of this report.

The need to maintain an adequate level of know-how inside the Commission

The evaluation found that REA has acquired substantial programme implementation
experience in the 2015-2018 period of operations. The Agency has also developed
expertise in the many thematic areas it works in, which have been expanded since the
2012-2015 evaluation was undertaken. However, there was a lack of uniformity in the
degree of attention given by REA to informing policy development across all EU policy
areas, with a perceived greater focus on some areas than others.

In the 2015-2018 period of operations, REA was found to have made a major effort to
ensure that regular communications (formal, informal) take place between the Agency
and the six parent DGs. Although the evaluation identified that, in some cases, the Agency faced a need to retain the capability to customise IT tools to meet the differing requirements of the parent DGs, REA was flexible in accommodating these requirements. For more details, please refer to section 4.1.3 of this report.

4.2.6 Recommendations on improving the quality of future cost–benefit analysis

Our analysis showed that many aspects related to the quality of cost estimations were
already addressed in the 2013 CBA (e.g. while the initial CBAs did not take into account
the additional costs of supervisory functions performed in the parent DGs, these
additional costs have more recently been considered in the retrospective CBAs and CBAs
related to extension of EA mandates). Our analysis of the 2013 CBA exercise also
revealed that:

 Formally, the decision on the selection of the delegation scenario was based primarily on the costs of the different scenarios analysed. However, the cost difference between

scenarios was marginal117 and the difference between the most and least expensive
scenarios constituted only 0.7 %.

 Results of the CBA were adjusted by the Commission to achieve further efficiency
gains (adjustment level varied significantly between Executive Agencies, in some
cases specific financial statements of the Executive Agencies contained a number of
inconsistencies, such as inconsistent application of stated average cost assumptions,
etc.); moreover, the actual costs of the Executive Agencies deviated from the initial
estimations. Combined, such deviations were significantly larger than cost differences
between the scenarios analysed in the 2013 CBA.

 Further, the cost-effectiveness gains in the case of all Executive Agencies primarily
resulted from the same source: a higher share of lower cost personnel (CAs)
employed within the Executive Agencies compared to the in-house scenario.

Therefore, comparison of the cost differences provided very limited information on the different delegation alternatives. Hence, the quality of future CBAs may be improved through the following steps:

1. Improving the quality of the assessment of non-financial aspects, related to a)


improving service delivery at EAs (through increased specialisation and enhanced
commitment to specific results, better service delivery in terms of reduced time for
contracting, more rapid approval procedures for technical and financial reports and
lower payment delays, simplification of the procedures used, increased external
communication and dissemination of results, which contribute to enhancing the
visibility of the EU, ensuring continuity of the administration provisions, etc.), and b)
enabling the Commission to concentrate on its “core functions” such as policy design
and supervision while maintaining an adequate level of know-how related to
implementation of the programmes. For this element, our report identifies how the
qualitative aspects provided in Article 3(1) of Regulation (EC) No 58/2003 could be
operationalised and integrated into the overall framework of the CBA and evaluation.
The non-financial aspects of CBA could be assessed in qualitative terms.

2. Improved quality of the assessment of non-financial aspects would lead to better


informed and evidence-based decisions on allocation of programme
management tasks to the specific Executive Agencies. Allocation of tasks
between Executive Agencies could be subject to the initially set boundary conditions,
such as number and location of the Executive Agencies, size of the Executive Agencies
(to ensure that the Executive Agencies should be optimally sized in the future with no
Agency being ‘too large’ or ‘too small’ to yield economies of scale), etc.

3. This then could be followed by a detailed workload assessment exercise in the


specific Agencies (using a common methodology) and a corresponding
allocation of human and financial resources to the Executive Agencies of the
Commission.

5 Conclusions and Recommendations

5.1 Overall conclusions

This part of the report presents the conclusions and recommendations of the REA
evaluation, starting with the overall conclusions structured according to our
organisational model (see section 2.2.1 of the report), which links the regulatory and
operational framework with the enablers and results. To evaluate the overall
performance of the Agency, we also applied a conceptual framework connecting the
main elements of performance (objectives, inputs, processes, outputs and outcomes) in a
linear way.

117 The cost of the initial scenario was estimated at EUR 1.613 billion, alternative scenario 1 at EUR 1.609 billion and alternative scenario 2 at EUR 1.602 billion; all estimates are expressed in present value terms.

Figure 35. Framework of the Agency's performance: a linear chain running from
objectives through inputs, process and outputs to outcomes, with effectiveness,
efficiency, process efficiency, cost-effectiveness and value for money mapped onto the
links between these elements.

Source: PPMI.

In terms of the regulatory and operational framework, REA operated under the new
legal framework of Horizon 2020 on delegation to Agencies and new management
modes. Concerning its mandate and responsibilities, the Agency successfully
accommodated the transfer of the new tasks during the evaluation period. In terms of
governance, the delimitation of responsibilities between REA and its parent DGs was
sound. During the evaluation period, REA and its new parent DGs further formalised the
sharing of responsibilities and collaboration with the adoption of a new MoU. The
internal organisation of the Agency closely corresponded to its mandate and matched
the specificity of the tasks delegated to it. Overall, the regulatory and operational
framework was sound and created good pre-conditions for the smooth operation of the
Agency and achievement of the expected results.

First, concerning enablers, REA successfully implemented its AWPs between 2015 and
2018, and effectively supported its parent DGs in reaching their objectives in newly
mandated activities such as the implementation of SEDIA. Second, in terms of HR
management, REA linked its strategic and HR objectives by developing the new HR
strategy. The Agency also designed and advanced the implementation of a dedicated
action plan focusing on a set of priority areas resulting from the 2016 Staff Survey.
Third, while implementing the programmes delegated to it, REA efficiently used other
organisational resources (finances, technology and information). Fourth, REA
cooperated well with its parent DGs and adequately used the services of external
experts, most of whom were satisfied with various aspects of the Agency’s performance.
However, there is scope to further formalise specific business processes between the
Agency and its parent DGs, including those related to the provision of policy feedback
tailored to the Commission’s needs. The Commission and REA also need to work together
to further clarify and consolidate several business processes related to selection of
experts, validation of expert lists and participation of EC officials in project-monitoring
activities. Fifth, the process of grant management in REA continued to be efficient. A
number of innovations and simplifications were introduced during the evaluation period,
which further contributed to the operational efficiency of the Agency. The evaluation also
revealed that the Agency could gain more flexibility in the implementation of cross-
cutting calls and additional types of instruments. Overall, the interaction of the different
enabling factors led to the efficient functioning of the Agency and its further
improvement, which translated into the achievement of good results during the period
2015-2018.

The results (intended and unintended) were divided into the following types: key
performance results, customer-oriented results, people results and policy results. First,
key performance results represent the financial and non-financial performance of the
Agency. Second, customer-related results mirror the perception of the Agency’s work
by its clients (applicants and beneficiaries). Third, people results address the
competences of REA staff and the know-how of programme management at the
Commission. Fourth, policy results correspond to the Agency’s contributions to
policymaking.

In terms of key indicators, our analysis indicates that REA performed in an effective,
efficient and cost-effective way in implementing its tasks during the period 2015-2018.
The Agency was largely effective in achieving its objectives and achieved very good
results in terms of most KPIs. Furthermore, REA continued to deliver a high-quality and
effective service to its clients and other stakeholders through its central support
services. At the same time, the Agency expanded its portfolio of activities with the
development and roll-out of the SEDIA participant validation service.

We also measured efficiency in REA’s operation, which is defined as the ratio between
administrative budget (the costs in Title I “Staff related expenditure,” Title II
“Infrastructure and operating expenditure” and Title III “Programme support related
expenditure”) and the operational budget managed by the Agency. Our performance
measures demonstrating efficiency were budget “per head” (million EUR) and the ratio
between the administrative and operational budget (%). Our evaluation results indicate
that the ratio of budget “per operational head” increased from EUR 3.46 million to
EUR 3.54 million (if the budget is calculated in terms of commitments) or from EUR 2.44
million to EUR 2.99 million (in payments) per year in 2014-2017. This means that, on
average, REA staff managed an increasing volume of the operational budget during this
period, primarily because the growing volume of the operational budget outpaced the
increase in the number of the Agency’s operational staff (i.e. REA staff minus staff
working in the central support services) during this period.

Table 21. Performance indicators of REA, 2014-2017, EUR million or percentage.

| Indicator | 2014 | 2015 | 2016 | 2017 |
|---|---|---|---|---|
| Operational budget, commitments | 1 553.51 | 1 521.74 | 1 666.36 | 2 009.15 |
| Operational budget, payments | 1 096.88 | 1 418.48 | 1 642.94 | 1 697.18 |
| Administrative budget, commitments | 51.24 | 54.19 | 59.74 | 66.87 |
| Administrative budget, payments | 49.68 | 54.10 | 59.23 | 66.12 |
| Administrative budget without the central support services, commitments | 33.39 | 39.88 | 42.42 | 43.92 |
| Actual number of staff (at the end of the year) | 548 | 608 | 628 | 693 |
| Actual number of operational staff (without the central support services) | 449 | 503 | 516 | 567 |
| Programme management cost at REA (excluding the central support services), commitments | 2.15 % | 2.62 % | 2.55 % | 2.19 % |
| Programme management cost at REA (excluding the central support services), payments | 3.04 % | 2.81 % | 2.58 % | 2.59 % |
| Budget ‘per operational head,’ commitments | 3.46 | 3.03 | 3.23 | 3.54 |
| Budget ‘per operational head,’ payments | 2.44 | 2.82 | 3.18 | 2.99 |
| Proposals received | 11 473 | 15 415 | 13 005 | 14 927 |
| Total running projects | 6 925 | 7 020 | 6 658 | 6 420 |
| Running projects ‘per operational head’ | 15.42 | 13.96 | 12.90 | 11.32 |

Source: PPMI based on REA’s administrative data.
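The ‘per operational head’ indicators in Table 21 are simple quotients of an annual figure by the number of operational staff. A minimal sketch reproducing selected values from the table (the helper name `per_head` is illustrative only; all figures are taken from the table above, budget values in EUR million):

```python
# Reproduce selected 'per operational head' indicators from Table 21.
operational_budget_commitments = {2014: 1553.51, 2017: 2009.15}
operational_budget_payments = {2014: 1096.88, 2017: 1697.18}
running_projects = {2014: 6925, 2017: 6420}
operational_staff = {2014: 449, 2017: 567}  # excluding central support services

def per_head(values, year):
    """Value managed per member of operational staff, rounded as in the table."""
    return round(values[year] / operational_staff[year], 2)

print(per_head(operational_budget_commitments, 2014))  # 3.46
print(per_head(operational_budget_commitments, 2017))  # 3.54
print(per_head(operational_budget_payments, 2014))     # 2.44
print(per_head(operational_budget_payments, 2017))     # 2.99
print(per_head(running_projects, 2014))                # 15.42
print(per_head(running_projects, 2017))                # 11.32
```

The results match the corresponding rows of Table 21, confirming that the ‘per head’ indicators are internally consistent with the budget, project and staffing figures reported there.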

In terms of administrative efficiency (the ratio between the administrative and
operational budget), the cost of REA’s programme management, excluding the share of
the costs of the central support services, decreased from 2.81 % in 2015 to 2.59 % (in
terms of payments) in 2017. This is due to a higher increase in the operational budget
compared to that in the administrative budget (without these services) during the period
2015-2017. REA also estimated the total cost of management without the cost of experts
charged to the operational budget, which accounted for 2.36 % in payments made in
2017118. The fact that the Agency’s programme management cost was below 3 %
(excluding the central support services) and much below the maximum ceiling of 5 %
provided for the Commission’s administrative expenditure for H2020 confirms that the
Agency continued to be an efficient structure for the management of the delegated
programmes in the Commission.
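As a cross-check, the commitment-based programme management cost ratio can be recomputed directly from the figures in Table 21 (administrative budget excluding the central support services, divided by the operational budget, both in EUR million); a minimal sketch:

```python
# Recompute the programme management cost ratio (commitments) from Table 21.
admin_budget_excl_css = {2014: 33.39, 2015: 39.88, 2016: 42.42, 2017: 43.92}
operational_budget = {2014: 1553.51, 2015: 1521.74, 2016: 1666.36, 2017: 2009.15}

for year in sorted(admin_budget_excl_css):
    # Ratio expressed as a percentage of the operational budget.
    ratio = 100 * admin_budget_excl_css[year] / operational_budget[year]
    print(f"{year}: {ratio:.2f} %")
```

The computed ratios (2.15 %, 2.62 %, 2.55 % and 2.19 % for 2014-2017) match the corresponding row of Table 21 and stay well below the 5 % ceiling mentioned above.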

Process efficiency is the extent to which programme management is simplified and
optimal processes are utilised for the management of applications and projects. During
the evaluation period 2015-2018, REA continued the optimisation of its procedures and
programme management functions and introduced a number of further simplifications in
cooperation with the Commission. For instance, the development of the IT tools and
electronic workflows improved the delivery of the programmes’ management functions
and increased the satisfaction of beneficiaries with most grant management processes.
However, the main areas with lower levels of satisfaction were still related to the user-friendliness
of the corporate IT tools, as identified during the previous evaluation (2012-2015).

The number of proposals at REA fluctuated from 15 415 in 2015 to 14 927 in 2017. The
number of total running projects decreased from 7 020 in 2015 to 6 420 in 2017. During
the evaluation period, the actual workload of REA was higher than estimated in the 2013
CBA due to the higher than expected number of new H2020 grants, which was the main
workload assessment parameter. This also increased the workload of monitoring the
project implementation. However, the number of running projects ‘per operational head’
fell from 15.42 in 2014 to 11.32 in 2017 due to an increase in the number of operational

staff during this period and closure of many small FP7 projects (MSCA individual
fellowships and grants).

118 REA 2017 Annual Activity Report, p. 58.

Moreover, we measured cost-effectiveness, which is defined as the extent to which the
Agency achieved its outcomes at a lower cost compared with in-house management by
the Commission. We used the indicator of cost savings relative to the costs of in-house
management by the Commission. Our retrospective CBA found that the actual savings
were EUR 105 million during the period 2015-2018. This analysis indicates that due to
lower staff costs the Executive Agency scenario remained considerably more cost-
effective than the in-house scenario, generating substantial savings to the EU budget.

While the Commission’s Executive Agencies should keep the costs of programme
management at a reasonable level, they should also ensure the quality of their services
provided to applicants and beneficiaries. Therefore, we measured customer-related
results of REA. The survey of REA’s beneficiaries indicated a high level of satisfaction
with its performance. Overall, 86 % of respondents were very satisfied or satisfied with
the Agency’s services in 2018, an improvement on the level of satisfaction recorded in
the previous evaluation (82 % in 2015).

Finally, REA achieved good results in terms of its people management. The Agency
scored above the averages of the Commission and its Executive Agencies for many HR
indicators (e.g. the Staff Engagement Index, Overall Job Satisfaction and Well-being). In
addition, the level of satisfaction of the Agency’s staff increased from 2014 to 2016.

Overall, a combination of the effective execution of the tasks delegated to the Agency (in
terms of KPIs), the low programme management costs and the high satisfaction of REA’s
beneficiaries point to high value for money generated by the Agency’s performance in
the period 2015-2018. In addition, the evaluation of the Agency’s operation indicates
that the establishment and further development of new management modes in the
Commission continued to provide a sound management framework and ensured the
effective and efficient implementation of the programmes delegated to the Agency.

The mandate of REA remains highly relevant to the needs of the Commission and the
Agency’s applicants/beneficiaries in the remaining part of the programming period (from
mid-2018 to 2020). The initial identification of tasks entrusted to the Agency is still valid
for justifying the delegation. Such judgement is supported by the findings of the CBA that
assessed the issues identified in Article 3 of Regulation 58/2003 (see section 4.2 of the
Final Report) and indicated significant advantages of the Executive Agency scenario
without finding any major drawbacks of delegation. Furthermore, this judgement is
supported by the specific conclusions presented in more detail in the section below.

Annex 6 of the report presents the general results of benchmarking comparing REA to
other Executive Agencies, while the findings of our specific benchmarking concerning the
management of human resources or client satisfaction are provided in section 4 of the
report and summarised in the specific conclusions and recommendations below.

5.2 Specific conclusions and recommendations

5.2.1 Effectiveness of REA

In 2015-2018, the Agency operated according to the legal framework establishing it and
was flexible and effective in addressing the key changes that occurred during the
evaluation period.

REA’s mandate was extended in 2014 with a significantly increased portfolio of H2020
programmes and administrative and logistical support activities. The mandate was
further extended in 2017 when, among other extensions of its mandate, the Agency
assumed a primary role in the implementation of SEDIA.
During the evaluation period, REA was also flexible enough to accommodate several
major changes, including the adoption of a new organisational structure and HR
strategy, as well as the introduction of remote
evaluations. REA’s response to these key changes, especially to those induced by the
extended mandate, was appropriate and allowed the Agency to operate according to its
legal framework. The evaluation concludes that the Agency successfully accommodated
the handover of the new tasks and coped well with the increased mandate.

The legal framework and the Memorandum of Understanding set out flexible provisions to
ensure overall policy coherence and communication between REA and its parent DGs,
while avoiding micro-management. However, there was still a need
to further consolidate several processes related to selection of independent experts,
validation of expert lists, as well as participation of EC officials in project-monitoring
activities. A strengthened involvement of EC officials and an increase in their overall
knowledge of the specific business processes involved would increase the effectiveness of
the collaboration in the related areas.

The 2016 MoU set out modalities for the implementation of Horizon 2020 and FP7 that
allowed both REA and its parent DGs a level of flexibility to determine specific business
processes at the parent DG or unit level. The provisions have generally worked well and
the effectiveness of the collaboration was appreciated by both the parent DGs and REA.
Nevertheless, the evaluation identified several business processes where an improvement
was needed, such as selection of independent experts, validation of expert lists, as well
as participation of EC officials in project-monitoring activities. There were instances
where specific needs of the EC were not fully met, but also cases where REA’s proactive
actions and supply of information and communication were not always taken up by the
parent DGs. While these issues did not necessarily affect the working practices of the
parent DGs and REA with long-standing collaborations dating back to FP7, some related
issues were mentioned by several parent DGs and units where the collaboration started
more recently.

A key finding emerging from the evaluation is that the overall level of
awareness of the specific business processes involved varied among the parent DGs and
units. While this implies a need for actions to improve the awareness among EC officials,
REA had a role to ensure that the information flow and business processes were tailored
to the specific needs of the parent DGs and their units. As a result, further consolidation
of the business processes in the identified areas should be implemented with the
proactive involvement of both the EC and REA.

Recommendation 1: further consolidate and streamline business processes related to
the selection of independent experts for evaluations, validation of expert lists, as well as
the participation of EC officials in project-monitoring activities. In doing so, the EC and
REA should work together to ensure that the detailed provisions and wider context of the
MoU, H2020 Vademecum and other relevant documents are clearly communicated to EC
officials.

Recommendation 1 is addressed to the Commission and REA.

Overall, REA was effective in achieving its objectives set out in the AWPs between 2015
and 2018, and continuously improved its operations even though it was already achieving
a very high level of overall effectiveness.

The evaluation found that the Agency fully achieved its objectives and KPIs set out in the
AWPs between 2015 and 2018. Overall, REA was responsive to changing policy contexts
as illustrated by the Agency’s effective actions in the areas where policy developments
necessitated prompt responses from the Agency. In line with a considerable increase in
its scope of activities following the extension of its mandate in 2017, REA continued to
further improve and align its business processes and tools, organisational structure and
HR management, as well as its internal control system. These and other changes
introduced enabled the Agency to effectively cope with a higher workload than was
estimated in the 2013 CBA without an increase in staff beyond what was originally
planned in the said exercise.

Concerning the lessons learned during the 2015-2018 period, the evaluation identified
that the growing prevalence of actions with policy interest required REA and its parent
DGs to formalise the specific modalities and procedures for their implementation.

There was an increasing demand for REA to assume the implementation of new types of
activities due to the delegation of large parts of the H2020 programmes to the Agency. In
some programmes, REA was implementing almost entire programme portfolios except for
a small number of projects which were still implemented in-house at the Commission.
Since REA was implementing very significant parts of several H2020 programmes, its
portfolio included some actions with a higher degree of policy relevance to the
Commission. The growing prevalence of such actions with policy interest required specific
implementation modalities which would ensure a closer involvement of the EC in the
project monitoring and policy feedback activities while maintaining REA’s primary roles
and responsibilities. At the time of the evaluation, the legal framework, which regulated
the collaboration between the parent DGs and REA, lacked specific modalities and
procedures for the implementation of actions with policy interest.

Recommendation 2: consider formalising the specific modalities and procedures for the
implementation of actions with policy interest. The implementation modalities and
resulting business processes should allow the Commission to be more closely involved in
the project implementation activities, while maintaining REA’s principal roles and
responsibilities as described in its legal framework.

Recommendation 2 is addressed to REA and the Commission.

The evaluation identified that another potential area for improvement related to the
implementation of cross-cutting calls. However, it is expected that the additional degree
of flexibility envisaged under the next EU Framework Programme for Research and
Innovation will contribute to the efficiency gains and economies of scale achieved by the
Agency in the implementation of cross-cutting calls.

REA’s beneficiaries, unsuccessful applicants, external experts contracted by the Agency
and EC officials with whom REA staff worked during the evaluation period were overall
satisfied with the performance of the Agency.

Overall, around 86 % of respondents to the survey of beneficiaries were satisfied with
the services provided by the Agency. This result is above the overall level of satisfaction
reported by the beneficiaries of CHAFEA (74 %), and on par with the levels of satisfaction
reported by the beneficiaries of EACEA (89 %) and ERCEA (89 %). This shows that REA
was among the best-performing Executive Agencies. By comparison, the overall satisfaction
with REA’s performance was lower among the surveyed EC officials (79 %) and
unsuccessful applicants (55 %). However, the latter result may relate, at least to some
extent, to the negative outcome of their application process (rejection of the application).

Figure 36. Overall satisfaction of beneficiaries, applicants, and EC officials with the performance of the Agency.

| Respondent group | Very satisfied | Satisfied | Neither satisfied nor dissatisfied | Dissatisfied | Very dissatisfied |
|---|---|---|---|---|---|
| Beneficiaries (N=583) | 41.5 % | 44.6 % | 9.1 % | 3.3 % | 1.5 % |
| EC officials (N=43) | 34.9 % | 44.2 % | 14.0 % | 7.0 % | – |
| Unsuccessful applicants (N=262) | 6.9 % | 48.1 % | 32.4 % | 6.5 % | 6.1 % |

Source: survey of REA’s beneficiaries, unsuccessful applicants and EC officials.

The Agency worked closely with its beneficiaries and effectively served as a direct contact
point for the applicants and beneficiaries of EU funding.

REA took proactive measures to boost awareness of new funding opportunities for
applicants and opportunities for experts, to broaden the pool of participants, to
consolidate a service-oriented approach to communication, and to support parent DGs in
giving visibility to EU research via success stories and input to the policy-feedback loop.
result, around 80 % of the surveyed independent experts believed that the information
on how to become an external expert was easily available. For example, around 3 in 4 of
the surveyed beneficiaries agreed that REA staff assigned to their project were easily
available and responsive during the implementation, or that the feedback received on the
progress with the content in their project was useful. In addition, REA continuously
provided success stories and other dissemination material to DG RTD, and to other
parent DGs for publication and contributed to numerous events and publicity campaigns.
This resulted in an increased visibility of programmes managed by REA and contributed
to the overall visibility of the EU FPs.

Regarding REA’s contribution to the visibility of the EU as a promoter of the programme, a
large majority of beneficiaries related the grants managed by REA both to the EU budget
large majority of beneficiaries related the grants managed by REA both to the EU budget
(95 %) and to the European Commission as an institution (84 %).

5.2.2 Efficiency of REA

Overall, REA was efficient in managing the delegated programmes and achieved very
good results in terms of most KPIs. Compared to the previous years, the Agency’s
performance further improved during the 2015-2018 period. This was accompanied by
an increasing level of satisfaction among REA’s beneficiaries with the performance of
the Agency during all stages of the project life-cycle.

REA further improved its performance in terms of the timely conclusion of grant
agreements (measured in “Time-To-Grant” – TTG) both over the evaluation period and
especially when compared to previous years. The average REA TTG decreased from 222
days in 2014 to 193 days in 2016-2018, or 52 days below the H2020 target of 245
days. The share of grants concluded within TTG targets reached 99 % for 2015 calls
and nearly 100 % for 2016-2017 calls.

Regarding payments for grants, the average Time-to-Pay (TTP) stood well below the
contractual thresholds for all types of payments (pre-financing, interim and final
payments) in 2015-2018 both for FP7 and H2020. Concerning the share of payments
within contractual limits, nearly 100 % of all pre-financing payments in 2015-2018
were executed on time. Regarding interim and final payments, REA maintained a
similar performance level compared to the previous evaluation period for FP7 (94 % of
payments executed on time) and during 2017-2018 it improved its performance for
H2020 (98 % of payments executed on time). REA also managed to achieve full
execution of its operational budget both in commitment and payment appropriations.
Regarding the legality and regularity of transactions, the estimated residual multi-
annual error rate remained in a similar range over 2014-2018 compared to those in the
previous years for all FP7 programmes implemented by REA. The residual multi-annual
error rate for the SME actions, Space and Security was above the materiality threshold
of 2 %, while that for the People Programme remained below 2 %. It should be noted
that the estimated error rate in the SME legacy programme could, to a large extent, be
attributed to the complexity of the funding scheme and not to underperformance of the
Agency. The error rates observed in this programme remained in line with the general
FP7 trend.

In line with the previous evaluation period, during 2015-2018 REA proved to be an
efficient and cost-effective structure for the management of the delegated
programmes. The overall cost-effectiveness of REA grew over 2010-2017.

The surveyed beneficiaries were generally satisfied with REA’s performance during
various stages of the project life-cycle, and the level of satisfaction grew compared to
the previous 2015 beneficiaries’ survey. However, the main areas with lower levels of
satisfaction were still related to the user-friendliness of the IT tools employed throughout
the project management life-cycle119. On the one hand, our analysis showed that the
corporate IT tools were major contributors to the improved KPIs and increased the
beneficiaries’ level of satisfaction related to most processes in the grant management
life-cycle. On the other hand, the beneficiaries still regarded the user-friendliness and
overall experience of using the IT tools as an area for improvement. As a result, further
efforts to improve the user-friendliness and user-experience of the corporate IT tools
related to programme management should be considered.

Overall, REA delivered a high-quality and effective service to its clients and other
stakeholders in the area of the management and provision of central support services,
achieving and exceeding the KPIs set.

With regard to the supervision of proposal evaluation activities and the management of
central support services, the Agency coped well with the increased workload following the
expansion of its mandate in 2017. The Agency improved its performance with respect to
time-to-pay for expert evaluators during 2015-2018 compared to the previous evaluation
period. The feedback received from the independent experts contracted by the Agency
suggests they were highly satisfied with the service provided and almost all experts (i.e.
99 %) were certainly or potentially willing to work with REA in the future. Although the
overall processes of registration, selection and contracting of experts were assessed
more favourably than in the previous evaluation period, satisfaction remained lower in
several areas: for instance, the online evaluation system used, the quality of the
templates used for the reports, and the time available for drafting the reports120.

With regard to the management of the Research Enquiry Service, Financial Capacity
Checks and the participant Validation Services, the overall performance of the Agency
was favourable and target deadlines for validations were fully respected between 2015
and 2018.

REA’s internal organisation and procedures were well suited to the specificity of the tasks
delegated to the Agency. The new HR strategy effectively linked REA’s multi-annual HR
objectives with its strategic objectives. Additionally, the HR strategy supported the
implementation of a dedicated action plan, which addressed the Agency’s main HR
challenges in the areas of concern identified by the 2016 Commission staff satisfaction
survey.

119 Which for H2020 are developed centrally by the Commission’s Common Support Centre.
120 These issues are not fully under the control of REA as it relies on corporate tools and templates provided by DIGIT/CSC or timing that results from the TTG targets.
In the 2016 European Commission Staff Survey, REA demonstrated consistently high
results compared to the wider Commission and other Executive Agencies in terms of staff
engagement, overall job satisfaction and well-being. With respect to the Agency’s HR
management, its key strengths were related to overall staff satisfaction with the Agency
as a modern and attractive workplace. In addition, the Agency achieved positive results
with respect to the willingness of its staff to give extra effort when required, the level of
understanding of the Agency’s purpose as well as job clarity. REA was effective in filling
its vacancies and maintaining a relatively low average vacancy rate (2 %). REA staff
turnover rate decreased from 5.7 % in 2015 to 3.1 % in 2018 and stood at 4 % on
average during the evaluation period. The Agency demonstrated less positive results in
terms of career development opportunities and mobility, work recognition and equal
opportunities as well as internal communication.

The Agency has stepped up its HR management with the adoption of its 2016 HR Action
Plan and through a comprehensive multi-annual HR strategy. The adoption of specific
HR monitoring and reporting measures allows REA to effectively track its progress,
particularly in the areas of career development opportunities and mobility, work
recognition and equal opportunities and internal communication.

5.2.3 Coherence of REA

No evidence of programming overlaps, duplication, gaps or inconsistencies was found
across the programmatic portfolio falling under REA’s delegated responsibility. Although
REA has been managing a diverse portfolio of sub-programmes within Horizon 2020,
and its mandate was again extended in 2015-2018, the portfolio of programmes
delegated to REA remained coherent.

Since 2017, the Agency’s remit has been extended to provide common administrative and
logistical support services to the rest of the research and innovation family involved in
H2020 implementation, and also to a wider range of Commission DGs. The MoU, which
was signed between the six parent DGs and the Agency in 2016, did not take these
changes into account. However, three separate legal documents were adopted in 2018
to guide REA’s operational work related to programme implementation, expert
management and the provision of the validation support services to EU bodies
managing grant and procurement procedures in the context of its extended mandate.
As a result, no coherence-related issues were found, since the types of support
provided, such as the legal and financial validation of successful applicants, involve a
set of common business processes, irrespective of the EU programme type.

The evaluation observed that significant progress has been made in strengthening
REA’s inputs to policymaking, particularly in terms of the quantity and quality of
feedback. While this was strongly appreciated by some DGs, the evaluation also
identified cases where the feedback produced by REA was only partially taken up.
suggested a need for policy DGs to work more closely with REA to formulate their
policymaking needs, and to agree on the preferred communication dissemination
channels.

In common with some other Executive Agencies (e.g. EASME, and also CHAFEA to some
extent), REA faces the organisational challenge of being answerable to multiple parent
DGs, which are responsible for a diverse range of policy areas. The challenges that REA
faces in addressing differing expectations across its six parent DGs as to what types of
knowledge outputs could best inform EU policymaking need to be recognised.
Accordingly, REA has initiated discussions with its parent DGs to confirm what their
policymaking needs are in order that REA can better customise the policy feedback it
provides.

The previous ‘Evaluation of the three years’ operations of REA, 2012-2015’ identified a
number of shortcomings relating to the effectiveness of support to policymaking, such
as the need for more regular and structured dialogue, and the importance of putting in
place a common formal methodology to provide policy feedback, supported by common
practices and more systematic reporting.

The present evaluation has identified evidence of changes made in the 2015-2018
period of operations in response to the recommendations from the earlier 2012-2015
period. REA initiated regular and structured dialogue with its parent DGs to develop a
strategy and action plan on policy feedback. The steps taken to strengthen strategic
consideration of the policy relevance of research results are positive compared with the
baseline situation.

The types of operational measures implemented by REA include the development of a
Catalogue of Policy Options and a supporting guidance document. Such measures provide
evidence of tangible progress compared with the situation in 2012-2015, since some
initiatives are already well-embedded, such as the organisation of thematic project
cluster meetings. These were strongly appreciated by some DGs. The catalogue
presents an extended inventory of alternative policy feedback outputs that can be
produced by the Agency. The Agency has also held coordination meetings, contributed
to the preparation of policy reports, participated in thematic events organised by the
EC, contributed to WP implementation/planning, produced reports for Programme
Committees and/or Advisory Groups, produced inputs for EC communication and
dissemination events, collected feedback from beneficiaries, and contributed to the
identification and dissemination of success stories. In addition, it has been involved in
numerous other policy feedback activities.

The survey of EC officials confirmed a high level of satisfaction with the frequency and
quality of outputs produced. Nevertheless, despite the steady stream of policy feedback
and its overall good quality, only 38 % of EC officials surveyed indicated that they were
provided with sufficient policy feedback to inform their policymaking tasks. Whereas
some officials recognised that the knowledge outputs being generated were relevant to
their policymaking needs, others would have preferred policy outputs more closely
tailored to those needs.

As a result, the extent of take-up of policy outputs by parent DGs was found to have
varied considerably depending on the DG concerned, and the types of policy feedback
outputs produced. The survey of EC officials revealed that more than half of all the
policy feedback outputs produced by REA were rarely used, even though their overall
quality was satisfactory. This suggests that the Agency was in some cases investing
resources in producing outputs which were only partially used by the EC.

The variation in the take-up of policy outputs by parent DGs may be explained by the
fact that the needs of policymakers have not yet generally been formulated in a clear
and concrete manner by all parent DGs, which has made it challenging for REA to
provide customised feedback that addresses needs satisfactorily. The evaluation
identified differences between ‘top-down’ and ‘bottom-up’ research programmes in this
regard. In the case of top-down research programmes, where policymakers develop the
AWP, the link to policy feedback activities is more direct compared with bottom-up
research programmes, such as the MSCA, where different types of feedback are
required.

A further finding was that policy feedback appeared to have been better appreciated
where REA had been managing programme implementation for some time, whereas
among DGs where the Agency had only recently taken over responsibility for particular
programmes, there were sometimes gaps in expectations as to what it was feasible for
REA to provide within its resource constraints. The study found evidence that the
efficiency and effectiveness of the policy feedback function of the Agency could be
strengthened.

Recommendation 3: continue using the Catalogue of Policy Options developed by REA
as the basis for establishing a set of specific policy feedback mechanisms and business
processes most relevant to parent DGs. The following potential actions are
recommended:
 Commission DGs, with the involvement of REA, need to better formulate their
policy-feedback needs and to specify in which format(s), and at which frequency,
they would prefer policy inputs to be produced. The established policy-feedback
needs should give due consideration to capacity constraints at the level of REA.
 Formalise the agreed short list of options and the related business processes
between the Commission and REA at the unit/parent DG level.

Recommendation 3 is addressed to the Commission and REA.

REA played a key role in supporting information flows between the Agency and the
Commission services on programme implementation. It also played an important role
as a competence centre with regard to the implementation of the EU FPs and in the
provision of administrative and logistical support services.

Notwithstanding this, some DGs perceived there to be a lack of uniformity in the degree
of attention given by REA to informing EU policy development across all thematic areas
for which it is managing programmes.
for which it is managing programmes. There was a perception of a greater focus on
some areas than others. While REA evidently needs to take this on board, and to
actively monitor whether it is giving reasonably proportionate attention to the different
policy areas across the many thematic domains in which it works, at least some knowledge
outputs have been produced relating to most policy domains within scope. For example,
the AAR contains an update on each policy area and the thematic cluster meetings have
been organised across many different policy domains.

In the 2012-2015 period of operations, the evaluation observed a lack of coherence in
the approaches taken to communication between the different parent DGs. Since then,
communication approaches to monitoring and reporting across REA’s portfolio have been
standardised. In the 2015-2018 period, REA was found to have
made a major effort to ensure that regular communications (formal, informal) take
place between REA and the six parent DGs.

In terms of the know-how generated by the Agency, substantial programme
implementation experience has been acquired by REA since its establishment, which
has been consolidated in the 2015-2018 period. The Agency has also developed
expertise across many thematic areas, since its overall programming mandate has been
significantly expanded since the 2012-2015 evaluation was undertaken. This know-how
has been largely retained within REA, with staff turnover levels comparable to those
in other Executive Agencies. REA demonstrated that it has been able to work together
with IT developers, especially DG DIGIT and the CSC in the development of common
Horizon 2020 IT tools to facilitate programme management. Although the evaluation
identified that the Agency faced a need to retain the capability to customise IT tools to
meet the differing requirements of the parent DGs in some cases, REA was flexible in
accommodating these requirements. It is noteworthy, however, that continuous efforts
should be undertaken to optimise the IT tool supporting SEDIA and make it more
efficient, which is crucial to underpinning REA’s work.

5.2.4 Retrospective CBA

Our retrospective CBA revealed that the actual costs of the Executive Agency scenario121
constituted EUR 257.8 million in 2015-2018 (please refer to Figure 37 below). In order to
evaluate the extent to which the actual costs corresponded to the initial SFS estimates, it
was important to ensure that the same assumptions were used as for the earlier SFS
estimates. The SFS estimates (EUR 264.8 million over 2015-2018) were based on the EU
contribution; however, REA’s administrative budget also included EFTA/EEA and third-
country contributions (EUR 11.5 million over 2015-2018) to manage additional
operational budget. Consequently, based on the EU contribution only, the actual costs of
the Executive Agency scenario constituted EUR 246.3 million, meaning that the actual
savings compared to the initial estimates for the Executive Agency scenario amounted to
EUR 18.4 million, or 7 % of the SFS estimates. Significant cost savings occurred in REA’s
Title II “Infrastructure and operating expenditure”, while the costs in Title I “Staff related
expenditure” were higher than estimated in the SFS, owing to higher average staff costs.
Higher staff expenditure may become an even more important issue in subsequent years,
since the average staff cost estimates remain constant in the SFS for the 2014-2020
period, while actual average staff costs might rise further due to salary indexation,
promotions and increasing staff seniority.

121 Including the cost of coordination and monitoring by the Commission and the costs of REA covered by
EEA/EFTA and third-country contributions.
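The like-for-like adjustment described above can be reproduced with a short calculation (a minimal sketch using the exact figures from Figure 37; the variable names are our own):

```python
# Retrospective CBA: restating the actual Executive Agency (EA) costs on the
# same basis as the SFS estimates (EU contribution only), figures in EUR.

sfs_estimate_ea = 264_750_000      # SFS estimate for 2015-2018 (EU contribution only)
actual_ea_total = 257_820_358      # actual EA costs, incl. EFTA/EEA & third-country funds
efta_third_country = 11_500_000    # EFTA/EEA and third-country contributions (rounded)

# Strip the non-EU contributions so both figures rest on the same assumptions
actual_ea_eu_only = actual_ea_total - efta_third_country   # EUR 246,320,358

savings_vs_sfs = sfs_estimate_ea - actual_ea_eu_only       # EUR 18,429,642
savings_share = savings_vs_sfs / sfs_estimate_ea           # ~0.07, i.e. ~7 %

print(f"EU-only actual costs: {actual_ea_eu_only:,}")
print(f"Savings vs SFS estimate: {savings_vs_sfs:,} ({savings_share:.1%})")
```

With the rounded EUR 11.5 million EFTA/EEA figure reported above, the restated costs and savings match the EUR 246.3 million and EUR 18.4 million cited in the text.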

Figure 37. Estimated costs and savings of the Executive Agency scenario in 2015-2018, EUR:

                                              SFS estimations      Actual
Costs of the in-house scenario                    344 651 804      362 920 000
Costs of the Executive Agency (EA) scenario       264 750 000      257 820 358
Savings of the EA scenario                         79 901 804      105 099 643

Source: SFS, own analysis.

The costs of the Executive Agency scenario were much lower than the estimated costs of
the in-house scenario. In 2015-2018 the actual cost savings deriving from the
difference in the cost of the Executive Agency scenario and the in-house
scenario constituted EUR 105 million (or 29 % of the estimated costs under the in-
house scenario). Comparing the savings initially estimated in the SFS with the actual
savings from the delegation of tasks to REA, we found that the actual savings during the
2015-2018 period were higher than the initial estimates (EUR 105 million compared to
EUR 80 million under the SFS estimates). As forecasted in the SFS and the ex ante CBA,
savings of the Executive Agency scenario primarily resulted from a higher share of lower-
cost contract staff (CAs) employed within the Executive Agency and the lower overall
number of staff.
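The headline savings figure can likewise be verified from the Figure 37 values (a sketch; apart from a one-euro rounding difference in the source data, the result matches the reported EUR 105 million):

```python
# Savings from delegation: actual Executive Agency (EA) costs vs the estimated
# in-house scenario for 2015-2018, figures in EUR (from Figure 37).

in_house_estimated = 362_920_000    # estimated cost had the tasks stayed in-house
ea_actual = 257_820_358             # actual cost of the EA scenario
sfs_savings_estimate = 79_901_804   # savings initially projected in the SFS

actual_savings = in_house_estimated - ea_actual           # ~EUR 105.1 million
share_of_in_house = actual_savings / in_house_estimated   # ~0.29, i.e. ~29 %

print(f"Actual savings: {actual_savings:,} ({share_of_in_house:.0%})")
print(f"Above the SFS projection by: {actual_savings - sfs_savings_estimate:,}")
```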

Overall, the actual workload122 of REA was higher than estimated in the 2013
CBA. In addition, the actual workload significantly deviated from the initial CBA
estimates across different programmes. Many parameters influencing REA’s workload
level (such as the operational budget allocated to relevant H2020 programmes and
actions, the average grant size and the corresponding number of grants) were beyond
the influence of the Agency. Information collected during the evaluation showed that
REA performed close and regular monitoring of the actual workload level (monitoring of
programme parameters and grants, updates of the productivity indicators and workload
simulator, etc.), which allowed the allocation of human resources across the Agency in
the most appropriate way.

122 Based on the number of new grants, which was the main workload estimation parameter in the 2013 CBA.
REFERENCES

General documents concerning the legal basis, financial management and
Executive Agencies
1. Communication to the Commission on the delegation of the management of the
2014-2020 programmes to executive agencies (SEC(2013)493).
2. Framework Regulation for Executive Agencies: Council Regulation (EC) No
58/2003 of 19 December 2002 laying down the statute for Executive Agencies to
be entrusted with certain tasks in the management of Community programmes.
3. Council Regulation (EU, Euratom) No 1311/2013 of 2 December 2013 laying
down the multiannual financial framework for the years 2014-2020 (2013).
OJ L 347, p. 884. <https://eur-lex.europa.eu/legal-
content/EN/TXT/?uri=CELEX%3A32013R1311>.
4. Commission Decision 2008/46/EC of 14 December 2007 setting up the
"Research Executive Agency" for the management of certain areas of the specific
Community programmes People, Capacities and Cooperation in the field of
research in application of Council Regulation (EC) No 58/2003.
5. Commission Implementing Decision of 13 December 2013 establishing the
Research Executive Agency and repealing Decision 2008/46/EC: extension of the
mandate until 2024 in 2014 (entered into force on 1 January 2015).
6. Commission Decision of 2.12.2014 establishing guidelines for the establishment
and operation of Executive Agencies financed by the general budget of the Union
(2014). <http://ec.europa.eu/transparency/regdoc/rep/3/2014/EN/3-2014-
9109-EN-F1-1.PDF>.
7. Financial statement accompanying the Commission's proposal to the legislative
authority for the Horizon 2020 Regulation.
8. Commission Implementing Decision of 13 December 2013 establishing the
Research Executive Agency and repealing Decision 2008/46/EC.
9. Commission Decision C(2013) 9418 of 20 December 2013 on delegating powers
to the Research Executive Agency with a view to the performance of tasks linked
to the implementation of Union programmes in the field of research and
innovation comprising, in particular, implementation of appropriations entered in
the general budget of the Union, as amended by Commission Decision C(2014)
9450, Commission Decision C(2015)8754 and C(2017)4900.

Evaluations and impact assessments and CBAs related to REA


1. EC (2013), Cost Benefit Analysis for the delegation of certain tasks regarding the
implementation of Union Programmes 2014-2020 to the Executive Agencies.
2. Specific Financial Statement related to the Decision Establishing the European
Research Council Executive Agency and repealing Decision 2008/37/EC.
3. PPMI (2016). Evaluation of the Operation of REA (2012-2015). Final Report.
4. PPMI (2018). Evaluation of the Operation of CHAFEA (2014-2016). Final Report.

Other relevant documents


1. EC (2016), 2016 European Commission Staff Survey: Analysis of the findings.
September 2016.
2. CAF, Common Assessment Framework 2013 Model. Maastricht: European
Institute of Public Administration.
<http://ec.europa.eu/eurostat/ramon/statmanuals/files/CAF_2013.pdf>.
3. REA 2015-2017 Annual Activity Reports.
4. REA 2015-2018 Interim Reports.
5. REA 2015-2018 Annual Work Programmes.
6. Memorandum of Understanding between the Research Executive Agency and DG
Research and Innovation, DG Education and Culture, DG Communication
Networks, Content and Technology, DG Agriculture and Rural Development, DG
Internal Market, Industry, Entrepreneurship and SMEs, DG Migration and Home
Affairs – Modalities and Procedures of Interaction, Document Final Version dated
30/11/2015.
7. EC (2016), Analysis of the 2016 European Commission Staff survey.
8. IAS (2016). Audit on Human Resources Management in REA. Final Report.
9. REA C1 (2017). Staff Satisfaction Survey 2016 Action Plan.
10. REA (2017). HR Strategy.
11. REA (2018). Organisational chart of REA (as of April 2018).
<https://ec.europa.eu/info/sites/info/files/rea_organisational_chart_16_may_2018_web.pdf>.
12. REA (2017). Strengthening Policy Feedback in REA: Mapping, Insights and
Recommendations.
13. REA (2018). Catalogue of Policy Feedback Options.
14. REA (2018). External Communication Workplan.
15. REA (2018). Strategy for a common dissemination and exploitation of R&I data
and results 2018-2020.
16. IAS (2018). Audit on the REA’s preparedness to deliver SEDIA-related services.
Closing Note.

ANNEXES

Annex 1: Key Activities Undertaken During the Evaluation

This section presents our detailed approach to the key data collection and analysis
methods applied in the evaluation. The methods are summarised in the table below
and include:

‒ extensive documentary review, desk research and analysis of business processes;


‒ statistical analysis of administrative and monitoring data;
‒ interview programme;
‒ in-depth study areas;
‒ surveys;
‒ cost–benefit analysis;
‒ focus group;
‒ benchmarking.

Documentary review and desk research

During the kick-off meeting, it was agreed that the contractor was to receive monitoring
data and statistics collected by REA and the Commission, notably:

 data on REA’s financial and non-financial performance available from the Annual
Activity Reports and other documentation;

 quantitative data collected from REA’s internal systems, other sources and staff
surveys concerning their level of satisfaction;

 contact data required for the stakeholder consultation strategy.

At the end of July, the Steering Group shared with the evaluation team a series of
documents on REA’s performance (covering the financial, legal and organisational
aspects of the Agency’s operation during the evaluation period) via a Dropbox link. When
REA’s interim report covering the period 1 January to 30 June 2018 became available in
mid-August, the Steering Group forwarded it to the contractor separately. The research
team uploaded the data which they received from the Agency to PPMI’s internal server.
Over 60 files were received and placed into 9 different folders: Commission decisions and
regulations (13), IAS audit reports and related documents (9), Specific Financial
Statement, CBA studies and other financial documents (18), Memorandum of
Understanding (1), previous evaluations and follow-up action plans (8), Annual Work
Programmes (4) and Annual Activity Reports (6) and interim reports (4).

Shortly after the kick-off meeting, PPMI also asked the Steering Group to share the
CORDA data with them as soon as possible, since the timely exchange of the CORDA data
was a key prerequisite for the launch of the survey programme. The contractor specified that
contact details for the following groups were to be provided:

 All Horizon 2020 applicants, including successful and unsuccessful applicants between
2015 and 2018. The contractor noted that as a minimum, they required the following
details for each H2020 proposal: proposal number, name, surname and email (due to
different rules and tools that the beneficiaries of FP7 and H2020 encountered during
their respective programming periods, it was agreed during the kick-off meeting that
the contractor would only concentrate on H2020 beneficiaries).

 REA’s independent experts (including evaluators and monitors) contracted between
2015 and 2018.

 EC officials from REA’s parent DGs who liaised directly with the Agency on a daily
basis between 2015 and 2018 (ideally including the DGs to which they belonged and
the programmes they were involved in). The officials’ contact details were needed for
survey C.

In response to PPMI’s request for contact data, the Steering Group transferred a set of
the CORDA data to PPMI, following which the research team assessed the quality of the
data received (i.e. contact details for the successful H2020 applicants) and identified
further data needs. The research team informed the Steering Group that nearly all the
administrative data necessary for the launch of the foreseen data collection activities
were received.

By mid-September, PPMI received a few sets of data, including:

 contact data for unsuccessful H2020 applicants;

 contact details for independent experts who were contracted by REA between 2015
and 2018.

Following the contractor’s request, the Commission also shared the contact details for EC
officials from REA’s parent DGs who liaised with the Agency during the reference period,
ensuring a sufficient availability of data for the timely launch of survey C.

Following the assessment of a series of documents on REA’s performance, the evaluation
team concluded that some additional data were needed. With regards to the CBA, the
contractor sent a request to the Commission on 20 November 2018 for the data on the
actual EEA/EFTA and third countries’ contributions to REA’s administrative and
operational budget. In response, the Steering Group provided these data on
17 December 2018, and the evaluation team made the necessary adjustments to the
report.

Lastly, the contractor requested the following data:

 staff turnover data for 2015, 2016, 2017 and 2018;

 number of staff working in the provision of central support services (incl. SEDIA) for
2015, 2016, 2017 and 2018 (in FTE);

 readiness report: IAS Audit on REA's preparedness to deliver SEDIA-related services –
Closing Note. Ares(2018)6101819 – 28/11/2018;

 CBA related to the delegation of SEDIA to REA;

 number of calls launched in 2018;

 number of proposals evaluated in 2018;

 number of grant agreements signed in 2018;

 number of running projects for H2020 and FP7 in 2018;

 number of external experts supervised remotely for 2015, 2016, 2017 and 2018;

 operational budget (million EUR), payments for 2018;

 administrative budget (million EUR), without central support services, payments for
2018.

The EC officials shared the data with the contractor between the end of January 2019
and the beginning of February 2019. The final/revised data were provided for the full
year of 2018 where possible or alternatively at least for the first semester of the year.

Interview programme

In line with the proposal, our interview programme involved around 46 interviews with
relevant stakeholders, including interviews with relevant EC and REA officials and other
relevant stakeholders from the CSC (31 in total), as well as the Agency’s beneficiaries (15
in total).

Regarding interviews with EC and REA officials, the evaluation team conducted around 31
group and individual interviews, including 16 interviews with EC officials and 15
interviews with REA staff between August and September as well as in January 2018. In
addition to the above, the evaluation team conducted 15 interviews with REA’s
beneficiaries in December 2018. While selecting the interviewees we maintained a good
balance between different programmes (H2020 MSCA, Societal Challenges, Industrial
Leadership, and Specific Objectives), funding instruments, and organisation types (i.e.
HEIs, public research organisations, SMEs).

Please refer to Annex 2 for a summary of key findings from the interview programme and
for a list of the interview questions and topics that guided our interview
programme.

In-depth study areas

The study team carried out in-depth analysis of REA’s performance in selected areas
through four in-depth study areas. The study areas shed more light on success stories
and lessons learned from the key developments of 2015-2018 across a range of different
programmes managed by REA. The cases also structure and synthesise data from the
stakeholder consultation and other methods (e.g. desk research, CBA, benchmarking).
These synthesised findings also fed into the final report. We carried out in-depth studies
in the following areas:

 In-depth study area 1: a study on REA’s coherence, separation of tasks/roles
between the Commission and the Agency, as well as maintenance of the know-how
within the Commission.

 In-depth study area 2: effectiveness and efficiency of the provision of
administrative and logistical support activities, incl. management of independent
experts.

 In-depth study area 3: effectiveness and efficiency of the newly introduced key
business processes & efficiency gains achieved.

 In-depth study area 4: REA’s key success stories and lessons learned during 2015-
2018. This is another cross-cutting study area that summarises REA’s added value and
lessons learned, based on information provided by REA’s key “clients,” including its
beneficiaries and EC officials.

A key added value of the study areas is that they pooled together the key findings for
certain important areas of REA’s performance which might otherwise be dispersed across
different parts of the study. In particular:

‒ Study area 1 synthesises findings from interviews with REA/EC staff, as well as
Survey C. The inclusion of this study area allows for a detailed, question-by-
question analysis of the various aspects of the EC’s joint work activities with REA.
The main report of the evaluation took the most important findings and conclusions
from this in-depth analysis.
‒ Study area 2 includes data from desk research interviews, as well as survey B. The
study area follows a similar logic in that it provides a detailed analysis of the survey
data.
‒ Study area 3 mostly relies on data from desk research and interviews with REA/EC
officials. The study area sheds light on some key innovations implemented by the
Agency which could potentially be taken up by other Executive Agencies.
‒ Study area 4 takes information from the key success stories and areas of added
value/extra mile delivered by the Agency, leading to better results for the
programmes it is mandated to implement, as well as identifies areas where REA
was able to learn valuable lessons.

The data from around 6-8 interviews fed into each in-depth study. Several aspects of the
CBA were included in these reports. The four study areas are made public and delivered
as standalone reports, in Annex 5.

Surveys

In line with the proposal, we conducted three different surveys during the evaluation,
including:

‒ Survey A: survey of REA’s beneficiaries and unsuccessful applicants;


‒ Survey B: survey of external experts contracted by REA;
‒ Survey C: survey of EC officials in the mirror units of the Commission (DG RTD; DG
GROW; DG EAC; DG CNECT; DG AGRI; DG HOME).

Scope and sample of the surveys

The internal database of REA – CORDA – served as the main source of information for
surveying respondents across the three samples. As discussed in section 2.2.2, Survey A
was addressed to all successful applicants/beneficiaries and unsuccessful applicants for
calls launched under the 2015-2018 AWPs (not including the legacy actions of FP7).
Survey B approached all external independent experts (including evaluators and
monitoring experts) contracted by the Agency between 2015 and 2018. Survey C
targeted over 80 EC officials in the mirror units of the Commission who have worked with
the Agency during the evaluation period.

As discussed in section 2.2.2, we did not foresee an additional survey of REA staff, as
our team relied on the data from the extensive survey exercise carried out in 2016 by
the Commission Services.

Production of surveys

Production of survey questionnaires was a key activity during the inception phase.
Annex 4 contains the final versions of the following questionnaires:

‒ Final questionnaire for Survey A: survey of successful Horizon 2020 applicants (with
completed and ongoing projects) as well as unsuccessful applicants;
‒ Final questionnaire for Survey B: survey of the external evaluators contracted by
REA during 2015-2018;
‒ Final questionnaire for Survey C: survey of EC officials in the parent DGs.
In the inception report, we emphasised that the availability of contact data was crucial
for the launch of the survey programme. In response, the Steering Group provided all
the contact details necessary for the launch of the survey programme. The contact
details for the beneficiaries and unsuccessful H2020 applicants as well as independent
experts (including evaluators and monitors) were provided to the contractor in
September for the launch of Surveys A and B. We also received the contact details for
EC officials (from REA’s parent DGs liaised with the Agency or benefited from REA’s
feedback (e.g. for policy strategy development, programme analysis, monitoring or
evaluation) during the evaluation period) in mid-October.

Data collection

We launched the full Surveys A and B on 9 and 10 October. Two reminders about these
campaigns were sent to the respondents; however, due to technical issues with the
SurveyGizmo server at the time of the launch of Survey A for beneficiaries, the evaluation
team decided to send an additional reminder following the initial launch of this survey
to ensure that all the respondents were successfully approached. Survey C was launched
on 16 October, and two reminders were sent to EC officials about this survey. It is
particularly noteworthy that aligning the last reminder with a DG RTD email asking the
potential respondents to take the time to fill out the survey had a considerable positive
effect on the response rate. All three surveys were successfully closed on 30 October.

The response rates are summarised in Table 22. Surveys B and C, in particular, received
excellent feedback. Two key reasons explain the lower response rate for Survey A
compared with the other surveys: a) over 85 % of the potential respondents provided to
the contractor were H2020 MSCA contacts from the host institutions, who are often
administrative officers/managers with no direct contact with REA; and b) in H2020, each
project has only one PCOCO/key contact person, who receives similar invitations on a
regular basis and is therefore subject to survey fatigue.

Pre-emptive measures were also undertaken at the questionnaire development stage
(development of a clear and concise questionnaire) and the respondent contacting stage
(timely reminders; attaching a support letter issued by DG RTD and the specific privacy
statement to the survey invitation) to lower the risk of insufficiently high response
rates.

Data treatment and analysis of survey results

The data collected through all of our surveys were categorised and analysed during the
evaluation. Detailed scrutiny of responses was also carried out in order to detect any
unforeseen data collection/recording issues and spot any unexpected trends that
emerged from the collected data. When the evaluation team observed that technical
issues (i.e. server downtime at SurveyGizmo) had affected the response rate, in particular
for Survey A, measures such as the launch of an additional reminder were taken to
address this challenge. The table below summarises the response and completion rates
for Surveys A, B and C.

Table 22. Summary of survey response and completion rates.

Title of survey                      Possible       Total       Response   Completed        Survey
                                     participants   responses   rate       questionnaires   completion rate
Survey A: beneficiaries              6 205          652         10.5 %     592              90.7 %
Survey A: unsuccessful applicants    5 000          306         6.1 %      270              88.2 %
Survey B: external experts           6 215          2 470       39.7 %     2 346            94.9 %
Survey C: EC officials               84             50          59.5 %     43               86.0 %

Source: compiled by PPMI.
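As a cross-check, the rates in Table 22 can be recomputed from the raw counts; truncating (rather than rounding) the percentages to one decimal place reproduces the published figures exactly (a sketch with our own helper name):

```python
# Recomputing the Table 22 rates from the raw counts.
# Percentages are truncated (not rounded) to one decimal place,
# which reproduces the figures as published.

def pct_truncated(part: int, whole: int) -> float:
    """Percentage of part/whole, truncated to one decimal place."""
    return part * 1000 // whole / 10

# (possible participants, total responses, completed questionnaires)
surveys = {
    "Survey A: beneficiaries":           (6205, 652, 592),
    "Survey A: unsuccessful applicants": (5000, 306, 270),
    "Survey B: external experts":        (6215, 2470, 2346),
    "Survey C: EC officials":            (84, 50, 43),
}

for name, (invited, responses, completed) in surveys.items():
    response_rate = pct_truncated(responses, invited)
    completion_rate = pct_truncated(completed, responses)
    print(f"{name}: response rate {response_rate} %, completion rate {completion_rate} %")
```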

Our team applied various tools and techniques to analyse the survey data, both
quantitative and qualitative. On the basis of the survey results, we measured to what
extent the Agency’s beneficiaries and unsuccessful applicants, external experts and EC
officials were satisfied with REA’s performance. This should help the Agency to identify
areas for improvement in the context of its organisational performance.

Data protection

The research team fully acknowledged and respected the privacy rights of individuals and
is committed to handling any personal information obtained from participants in the
survey programme or from REA and/or its parent DGs in accordance with the applicable
law and regulations, as well as ensuring protection of such data. The evaluation team
presented a clear description of the goal of the survey to persons and organisations
invited to participate in the study at the beginning of each survey. For analysis and
reporting purposes, only aggregated survey results were used. PPMI ensured that no
personal data were disclosed to third parties.

Cost–benefit analysis

Cost–benefit analysis is one of a number of techniques of economic evaluation, a type of
evaluation which provides a comparative assessment of two or more courses of action (in
this case, REA as opposed to other organisational arrangements) in terms of their costs
and consequences. Costs capture the resources (such as staff) which are committed to
an activity, while the consequences are the results of those activities that are of value to
the funders of the activities concerned.

For this interim evaluation, the methodology and scope of the CBA followed the specific requirements set in Article 3(1) of Council Regulation (EC) No 58/2003 of 19 December 2002, in the Commission Decision of 2 December 2014 (C(2014)9109 final) establishing guidelines for the establishment and operation of Executive Agencies financed by the general budget of the European Union (specifically appendix II “Scope and methodology of a Cost–benefit Analysis of the delegation of tasks to an Executive Agency”) and in the ToR. Following the requirements of these documents, the CBA covers both quantitative aspects (addressed in the retrospective CBA) and qualitative aspects (integrated into the overall evaluation framework and evaluation questions). These aspects are summarised in the conclusions of the retrospective CBA in section 5.2.4.

The results of the retrospective CBA (quantitative aspects) were shared and discussed
with REA during the interview programme carried out in August 2018. The updated
version of the retrospective CBA was presented to the Commission for coordination on 16 October 2018. The qualitative aspects of the retrospective CBA indicated in the ToR
and Article 3(1) of Regulation (EC) No 58/2003 were analysed in detail under Tasks 1
and 2. The key findings concerning each qualitative aspect of the CBA identified in Article
3 of Regulation 58/2003 are also summarised in section 4.2.5 of the report.
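Although the retrospective CBA itself is reported in section 5.2.4, its quantitative core can be illustrated schematically. The minimal sketch below uses hypothetical figures (not the study’s data) and shows one common way the “cost savings to the EU budget” comparison is made: the administrative cost of the Agency scenario against a counterfactual in-house (Commission) scenario over the same years.

```python
# Minimal sketch of the quantitative core of a retrospective CBA of
# delegation: compare the administrative cost of the agency scenario with
# a counterfactual in-house (Commission) scenario over the same years.
# All figures below are hypothetical illustrations, not the study's data.

def cost_savings(agency_costs: dict[int, float],
                 in_house_costs: dict[int, float]) -> float:
    """Savings to the EU budget = counterfactual cost minus actual agency
    cost, summed over the years both scenarios cover (positive values mean
    delegation was cheaper)."""
    years = agency_costs.keys() & in_house_costs.keys()
    return sum(in_house_costs[y] - agency_costs[y] for y in years)

# Hypothetical administrative budgets in million EUR per year.
agency = {2015: 58.0, 2016: 60.5, 2017: 63.0, 2018: 65.5}
in_house = {2015: 66.0, 2016: 68.5, 2017: 71.0, 2018: 73.5}

print(f"Cost savings over 2015-2018: EUR {cost_savings(agency, in_house):.1f} million")
```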

Focus group

The evaluation team organised an additional meeting/focus group in the framework of meeting 3 that is foreseen in the ToR. The discussions involved participants from REA,
the Commission services and other key stakeholders of the institution or other Executive
Agencies (19 participants in total). During the focus group, the evaluation team
presented the results of the evaluation and facilitated an in-depth discussion of the
report and possible modifications to be introduced in the final version of the report. The
meeting took place in the Commission's premises in Brussels. The purpose of this focus
group was:

‒ to confirm the accuracy of the data and the analysis presented;
‒ to deepen the analysis presented in the report;
‒ to discuss future options and strategies;
‒ to discuss potential policy conclusions and recommendations.

The event was aimed at an open, non-competitive exchange of views and followed the Chatham House Rule. The contractor kept the minutes to ensure the anonymity of all inputs and responses in the focus group. The focus group was moderated by the Director of our team, Vitalis Nakrosis. Mr Nakrosis has extensive experience in leading such events focused on validating research results and formulating practical policy conclusions. A summary of the results of the focus group is presented in Annex 2 of the report.

Benchmarking

The evaluation of REA included a comparison between indicators of the Agency’s performance during the previous evaluation study (2012-2015) and the new evaluation study (2015-2018). We analysed, where applicable, trends in the Agency’s performance across these two evaluation periods.

During the previous evaluation study, we compared the performance of REA over the period 2012-2014 based on a number of quantitative and qualitative indicators. During the new evaluation study, we expanded the scope of the comparative analysis by including new performance indicators and analysing performance trends over a longer period of time (from 2012 to 2017-2018). As indicated in Table 23 below, a set of 20 indicators was used in this comparative analysis. These indicators were grouped into specific types (inputs, outputs, efficiency, cost-effectiveness, effectiveness) following our framework for the evaluation of the Agency’s performance. Although the evaluation of REA does not require a comparison of performance between the Commission’s Executive Agencies, we also proposed benchmarking these Agencies. Since most of the indicators in Table 23 are also suitable for comparing REA’s performance with that of other Executive Agencies, we used some of them for a wider benchmarking across the Commission’s Executive Agencies.

Table 23. The indicators for comparative analysis and benchmarking.

No. | Indicator | Type of indicator | Possibility for benchmarking
1. | Number of staff (positions actually filled) | Inputs | Yes
2. | Number of staff for central support services (positions actually filled) | Inputs | No
3. | Operational staff (positions actually filled) | Inputs | Mostly no
4. | Operational budget (million EUR), commitments and payments | Inputs | Yes
5. | Administrative budget (million EUR), commitments and payments | Inputs | Yes
6. | Administrative budget without support services (million EUR), commitments and payments | Inputs | No
7. | Number of proposals received | Outputs | Mostly yes
8. | Number of running projects | Outputs | Mostly yes
9. | Budget 'per head' (million EUR) (excluding central support services) | Efficiency | Yes
10. | Number of proposals per operational staff | Efficiency | Mostly yes
11. | Number of running projects per operational staff | Efficiency | Mostly no
12. | Percentage of operating costs over the operational budget (excluding central support services) | Cost-effectiveness | Yes
13. | Cost savings to the EU budget (the results of CBA) | Cost-effectiveness | Mostly yes
14. | Achievement of key performance indicators (e.g. time-to-inform, time-to-grant or time-to-pay) | Effectiveness | Yes
15. | Satisfaction of applicants | Effectiveness | Mostly no
16. | Satisfaction of beneficiaries | Effectiveness | Mostly yes
17. | Satisfaction of experts | Effectiveness | Mostly no
18. | Staff well-being | Effectiveness | Yes
19. | Overall job satisfaction | Effectiveness | Yes
20. | Staff engagement | Effectiveness | Yes
Source: compiled by PPMI.
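Several of the efficiency indicators in Table 23 are simple ratios of output and input indicators. As a minimal illustration (with hypothetical figures, not REA’s actual data), indicators 9 and 10 can be derived as follows:

```python
# Illustrative derivation of two efficiency indicators in Table 23 from
# input and output indicators; the figures used are hypothetical, not REA's.

def efficiency_indicators(operational_budget_meur: float,
                          staff_excl_support: int,
                          proposals_received: int,
                          operational_staff: int) -> dict[str, float]:
    return {
        # Indicator 9: budget 'per head' (million EUR, excluding central
        # support services) = operational budget / staff excluding support
        "budget_per_head_meur": round(operational_budget_meur / staff_excl_support, 2),
        # Indicator 10: number of proposals per operational staff
        "proposals_per_operational_staff": round(proposals_received / operational_staff, 1),
    }

print(efficiency_indicators(operational_budget_meur=1700.0,
                            staff_excl_support=600,
                            proposals_received=15000,
                            operational_staff=450))
```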

Annex 6 of the Final Report presents the results of our comparative analysis and
benchmarking.

Annex 2: Stakeholder consultation synthesis report

The stakeholder consultation activities described in this synthesis report were conducted in the context of the evaluation of REA. They sought to collate evidence-based information and stakeholders’ views on the five mandatory evaluation criteria of the Agency (assessment of the regulatory framework, mission and governance; assessment of REA’s performance in 2015-2018, including effectiveness, efficiency and coherence; as well as a cost–benefit analysis), and to measure the satisfaction of relevant stakeholders with REA’s operations. The relevant stakeholders were mapped in the early stage of the evaluation and are described below.

Consultation methods and target groups

The stakeholder consultation methods included an extensive interview programme, three surveys, four in-depth case studies and a focus group. The contractor triangulated
different sources during the evaluation. Table 24 below provides details on the types of
stakeholders engaged for each consultation method.

Table 24. Type of stakeholder consultation and stakeholders engaged.

Exploratory interviews with EC and Agency officials as well as high-level officials (Q1 2018):
‒ Members of the Steering Committee
‒ Selected EC officials in the parent DGs/mirror units
‒ The Director of REA & heads of departments
‒ Heads of units/representatives of the operational units
‒ Heads of units/representatives of the Administrative Department
‒ Representatives of the Staff Committee, other employees

Survey of REA’s beneficiaries, contractors, applicants, external experts and EC officials in REA’s mirror units in the Commission (Q2 2018):
‒ REA’s beneficiaries and unsuccessful applicants who applied for/received funding under the 2015-2018 AWPs
‒ All external experts contracted by the Agency between 2015 and 2018
‒ EC officials in REA’s mirror units in the Commission with whom the Agency staff liaised during the evaluation period

Interviews with REA’s beneficiaries (Q2 2018):
‒ Beneficiaries of different programmes (including H2020, MSCA, Societal Challenges, Industrial Leadership, and Specific Objectives), funding instruments, and organisation types (i.e. HEIs, public research organisations, SMEs)

Focus group (Q3 2018):
‒ Members of the Steering Committee
‒ The Agency’s managerial staff
‒ Parent DGs including DG RTD, DG GROW, DG EAC, DG CNECT, DG AGRI and DG HOME

Summary of the survey results

The surveys were launched in October 2018 and were active for around three weeks. The
research team implemented three surveys: survey A, survey B and survey C. The design
of survey A allowed us to differentiate between the Agency’s beneficiaries and applicants,
while survey B was presented to external experts and survey C was dedicated to EC
officials. The survey questions were carefully crafted to ensure comparability across
groups of respondents and complementarity with the other consultation methods,
including the interviews. The total number of invitations sent and answers received are
presented in Annex 1. The survey data fed into all the evaluation questions, in particular those aspects for which the opinions of respondents are of prime importance. All evidence from the surveys was incorporated into the report of the study,
taking into account the representativeness of the responses. Table 25 below presents a
summary of the findings from the survey data.

Table 25. Summary of the survey results.

Task 2.1. Effectiveness

The overall satisfaction with the Agency’s performance:
The survey programme showed that a majority of stakeholders were satisfied with
REA’s performance between 2015 and 2018. The satisfaction with the Agency’s
performance was higher among REA’s beneficiaries (86 %) than among EC
officials (79 %) and unsuccessful applicants (55 %). Most beneficiaries of the
Agency further indicated that they would certainly or possibly consider applying
for REA’s calls for proposals again in the future. This sentiment was indicated by a
similar share of REA’s independent experts.
The survey of beneficiaries:
According to our survey of beneficiaries, there were four key channels used to learn about the EU research grants: recommendations by colleagues or superiors (21 %), the Research Participant Portal (17 %), European Commission websites such as the FP7/H2020 portal, REA website and CORDIS (17 %), and the National Contact Point (13 %). REA’s beneficiaries were largely satisfied with the way the Agency
communicated with them during various phases of the project life-cycle. Of the
respondents, 81 % strongly or rather agreed that the information from REA
concerning administrative requirements was clear. Only 66 % reported that they
knew who to contact for any question(s) they had or where to get help when
submitting their application. Even fewer of the beneficiaries (61 %) agreed that
they knew who to contact for any question(s) they had or where to get help when
preparing their application.
Of the respondents to the beneficiaries’ survey 81 % strongly or rather agreed
that REA staff assigned to their project were easily available and responsive during
the implementation. Most favourable results were received in relation to REA’s
availability and responsiveness during the grant amendment (82 % strongly or
rather agreed) and in the grant finalisation phases (85 % strongly or rather
agreed). About 74 % of the respondents claimed that the feedback received on
the scientific and technological progress in their project was useful. When it comes
to the effectiveness of communication channels used by REA to provide
beneficiaries with relevant information, email was the preferred tool of the
respondents to the beneficiaries’ survey. Altogether 92 % of them said that this
channel was useful to a large or at least to a moderate extent. About 57 % of the
surveyed REA’s beneficiaries thought that telephone contact was another key communication channel which, to a large or at least moderate extent, provided them with relevant, helpful information when they needed it. REA’s website and
face-to-face contacts were also seen as quite useful. On the other hand, a large
majority of beneficiaries doubted the relevance of live web and recorded video
briefings as well as REA’s Facebook profile. Some respondents noted that they also
relied on the communication portal within the Participant Portal or their local
grants office to obtain relevant information.

Concerning visibility, a large majority of beneficiaries related the grants and
tenders managed by REA both to the EU budget (95 %) and to the European
Commission as an institution (84 %). Over 80 % of the respondents strongly or
rather agreed with the view that the programmes managed by REA were well
advertised.
The survey of external experts:
REA effectively informed its network about the opportunity to work as an external expert. As many as 80 % of the surveyed independent experts
believed that the information on how to become an external expert was easily
available; 12 % of stakeholders were neutral on this issue. Recommendation by
colleagues and superiors in their institutions was mentioned by 26 % of the
surveyed external experts as the key channel to learn about the opportunity to
become an expert. Other key channels included the European Commission website
(19 %), participation in another research project supported by FP7/H2020 (18 %)
and relevant national sources such as research ministry, EU liaison office at their
university, etc. (15 %). External experts were very satisfied with the information
provided to them online by REA. The H2020 online manual for experts was seen as
especially useful by 94 % of external experts responding to the survey. The FAQs and other reference documents were also considered very to fairly useful by 93 % and 89 % of the respondents respectively.
External experts who worked with REA during the evaluation period were largely
satisfied with the Agency’s communication during every stage of their assignment.
It is notable that over 95 % of external experts strongly or rather agreed that REA
staff with whom they worked were responsive and provided them with answers to
their questions. Around 90 % of the external experts were also positive about the
clarity of tasks and procedures, who to contact with questions, etc.
The survey of EC officials:
Nearly 84 % of the surveyed EC officials agreed that REA effectively and efficiently
implemented the programmes delegated to it under their portfolio of activities. In
terms of the balance between policymaking and programme implementation tasks
carried out by the Commission and the Agency, around 58 % of the EC officials who responded to our survey stated that REA enabled them to focus entirely on their
policymaking tasks. Over a third of the respondents felt that they were (to some
extent) also involved in activities delegated to REA which could be regarded as
programme implementation tasks.
Only 38 % of the surveyed EC officials generally agreed that the Agency provided
them with sufficient policy feedback to inform their policymaking tasks.
Of the nearly 60 % of surveyed Commission officials who stated that they liaised with REA staff during the preparation of the work programmes and/or research topics in their areas of responsibility during 2015-2018, over 90 % agreed that the inputs provided by REA were timely and of high quality to a large or moderate extent. While 74 % of the surveyed EC officials
thought that REA provided them with relevant inputs for the programme
priorities/research topics under their or their unit’s responsibilities, less than 60 %
of the surveyed Commission officials noted that the inputs provided by REA were directly used in the preparation of the work programme or research topics.
Over 90 % of the surveyed EC officials felt that they had a good working
relationship with their REA counterpart(s) at interpersonal level. On the other
hand, about 60 % of the surveyed EC officials claimed that REA was proactive in
its regular communication with them, while a quarter of the officials thought that
the Agency was not proactive enough in its regular contact with counterparts in
the relevant parent DGs.
The majority of EC officials (76 %) reported that they felt sufficiently informed to a large or moderate extent about the progress of evaluations in their programmes and/or research topics; nevertheless, a substantial group (20 %) felt sufficiently informed only to a little extent or not at all. The Commission officials reported
that more of REA’s efforts should be directed at providing qualitative information
on the evaluation process (e.g. on the reasons for rejecting certain proposals).
About 85 % of the surveyed EC officials reported that they provided briefings to
REA staff before the start of the evaluation activities and 83 % of the surveyed EC
officials thought that REA staff were attentive to the briefings. In addition, 59 % of
the respondents stated that the Agency staff were active in the briefing sessions.
It should be noted that only 29 % of EC officials stated that the Agency staff took the briefings into account to a large extent when organising evaluations.
The Survey C respondents were also presented with a question about the extent
to which REA implemented a process which ensured that the proposals best
addressing the specific research topics under their areas of responsibility were
selected for funding, as set out in the H2020 Rules For Participation and further
detailed in the Vademecum. Around 55 % of the respondents (N=21) stated that
the Agency always implemented a process which ensured that the proposals best
addressing the specific research topics under their areas of responsibility were
selected for funding. Another 37 % of the respondents (N=14) thought that the
Agency was able to ensure that the best proposals were selected for funding most
of the time or sometimes. However, our survey results revealed that around 8 %
of EC officials (N=3) disagreed that REA implemented a process which ensured
that the proposals best addressing the specific research topics under their areas of
responsibility were selected for funding. As a result, they were concerned about
the profiles and expertise of some of the contracted experts which did not seem to
fit the specificities of the research projects or align with the Commission’s policy
and vision. Some policy officials at the parent DGs thought that they were not
systematically consulted on the pool of experts, and that their suggestions were
not always taken into account by REA.

Task 2.2. Efficiency

The survey of beneficiaries:
Respondents to the beneficiaries’ survey were generally positive about cooperation
with REA during all the project life-cycle stages, including the application, grant
finalisation/contracting, project/contract implementation, grant amendment,
reporting and payment phases.
In the context of the evaluation and selection phase, the majority of REA’s beneficiaries were satisfied with the Agency’s performance in terms of the timeliness of the evaluation and selection of proposals; a similar share were satisfied with the timeliness of contracting and with the overall time period between the call deadline and the signature of the grant agreement. The results of the
beneficiaries’ survey show that 85 % of respondents agreed that information for
applicants was easy to find and around 84 % claimed that it was clear. Around 74
% of respondents to the beneficiaries’ survey agreed that the requirements for the
application process were reasonable and proportionate. Also, a slightly higher
share of beneficiaries (77 %) agreed that the proposal templates were well
structured and easy to follow. Satisfaction with the user-friendliness of the
electronic systems used for submitting the applications reached 71 % among
beneficiaries.
Overall satisfaction with the transparency of the evaluation process among
beneficiaries reached 81 %. Altogether 87 % of respondents to the beneficiaries’
survey agreed that the outcome of the evaluation of their application was easy to
access via the Participant Portal. Concerning the quality of feedback provided on
the evaluation results, 80 % of respondents to the beneficiaries’ survey agreed
that the individual reviews and panel comments provided were clear and 77 %
that these comments were useful in understanding the strengths and weaknesses
of the proposal.
In terms of the grant finalisation phase, the REA beneficiaries’ survey revealed
that 87 % of the respondents were satisfied with the overall grant conclusion
process. Satisfaction with the clarity of instructions provided by REA at the
beginning of the granting process reached 80 %. The same 80 % of respondents
agreed that the requests from REA (e.g. for proposal modification or providing
missing information) were clear. The lowest level of satisfaction related to the user-friendliness of the IT tools used in the grant conclusion stage (66 %).
REA’s very good performance in terms of timely processing of payments was also reflected in the results of the beneficiaries’ survey – 94 % of the beneficiaries
were satisfied with the time period it took REA to make pre-financing payment and
89 % of them with the time period it took REA to process interim and final
payments. Although REA was complying well with the contractual TTA targets, a
moderate 68 % of beneficiaries were satisfied with the time period it took REA to
process grant amendment requests. A similar 67 % of beneficiaries were satisfied
with the fluency of the overall grant amendment process; a higher level of
satisfaction (76 %) related to clarity of information and advice provided by REA
during the amendment process. A number of respondents to the beneficiaries’

131
survey claimed that the grant amendment procedure was too heavy and time
consuming, especially for small grant amendments.
The results of the REA beneficiaries’ survey revealed that beneficiaries were
generally satisfied with most aspects related to reporting and monitoring of
grants. As many as 78 % of the respondents to the beneficiaries’ survey agreed
that technical and financial reporting requirements were clear, and a similar share agreed that the process of project monitoring by REA was smooth (76 %) and
transparent (78 %). Also, 75 % of the respondents agreed that feedback received
from REA on project progress was useful. Also, over 80 % of the respondents to
the beneficiaries’ survey agreed that the periodic reporting requirements were
appropriate to the level of activities in their project. The lowest level of satisfaction
(62 %) was related to the user-friendliness of the IT tools employed for
application and grant management.
With respect to the RES, REA’s beneficiaries thought that the Agency provided
high quality services. Around 93 % of the beneficiaries that used the services
found the responses received very helpful or helpful. On the other hand, while REA
provided a high-quality service, the survey responses imply that a significant
number of REA’s beneficiaries (51 %) were unaware of the possibility to use the
service.
The survey of unsuccessful applicants:
The level of satisfaction with various aspects of REA’s performance among the
unsuccessful applicants was generally lower compared to the beneficiaries’ survey.
However, it remained rather high – 75 % of respondents agreed (another 13 %
being neutral) that information for applicants was easy to find and 70 % (another
12 % being neutral) that it was clear. Around 59 % of respondents agreed
(another 18 % being neutral) that the requirements for the application process
were reasonable and proportionate and 64 % of them agreed (another 20 % being
neutral) that the proposal templates were well structured and easy to follow.
Satisfaction with the user-friendliness of the electronic systems used for
submitting the applications reached 68 % among applicants (another 18 % being
neutral).
The survey of unsuccessful applicants showed that the majority of them were
satisfied with the performance of the Agency related to timeliness of evaluation
and selection of proposals. However, the results of REA’s unsuccessful applicants’
survey also revealed that they were substantially less satisfied with the quality of
the evaluation process and feedback received compared to the beneficiaries.
Around 40 % of the unsuccessful applicants were satisfied with the overall
transparency of the evaluation process, and 43 % agreed that the individual reviews and panel comments provided were clear (32 % thought that these comments were useful in understanding the strengths and weaknesses of their proposal).
A number of applicants said that they would like to get better feedback on why
their applications failed (clearer, more substantial argumentation on the score
allocated to the proposal, key weaknesses, etc.). Also, some applicants felt that reviewers had not understood their proposal, that the comments provided were arbitrary, too general, vague or even contradictory, and that the grading was disproportionate to the weight of the weaknesses; a number of applicants claimed that a re-submitted proposal was given a lower score despite the improvements made following comments received under a previous evaluation. Some respondents referred to scores of above 90 % or even 95 % for a proposal not being funded, where the gap between applications selected for funding and rejected ones was very narrow (a
‘lottery’ element of the evaluation and selection). Some applicants would prefer a
two-stage application process due to high workload level related to the
preparation of the application and the low success rates. Also, some applicants
would appreciate more information on the overall results of the calls (which would
allow them to make an informed decision on whether to participate in future calls),
more specific information on their inclusion on a reserve list (e.g. position or time
frame for making decisions on the reserve list projects), etc.
The survey of external experts:
In terms of the registration process, almost 88 % of survey respondents strongly or rather agreed that the registration process was smooth; this number has increased since 2014, when 83 % of survey respondents strongly or rather agreed with this statement. The eligibility requirements and the ease of finding information on how to become an independent expert were also assessed positively. On the other hand, the findings were less positive in terms of the
transparency of the selection procedures. Nearly 57 % assessed this aspect of the
registration process positively, while over 20 % of survey respondents strongly or
rather disagreed with the related statement. Further analysis of the survey results
revealed that the recommendation by colleagues or superiors in one’s institution
(37 %), EC website (27 %), participation in the research project supported by
FP7/H2020 (26 %) and relevant national sources (21 %) were key sources of
information about the opportunity to become an independent expert of REA.
Survey data show that the majority of experts (typically between 86-95 %
depending on the specific question asked) had a positive experience and were
satisfied with the support provided and quality of the services received via REA’s
Legal Entity/Bank Account Validation service. Some 71 % of experts also strongly or rather agreed that the email notification they received asking them to provide their identity and bank account details was clear.
Regarding selection and contracting of experts, almost 94 % of survey
respondents were contacted with regard to their availability to work as an
independent expert for a particular evaluation or project before they received an
expert contract to sign. Overall, the selection and contracting process was assessed very positively, as was the time taken to issue the contract (about 90 % of survey respondents strongly or rather agreed with these statements).
Although the feedback about REA’s assistance to the experts during performance
of the assignments and tasks was largely positive, the responses of experts
showed lower satisfaction in aspects related to the online evaluation system used,
the quality of the templates used for the reports, as well as the time available for
drafting the reports.
The survey of external experts also revealed that 92 % of independent experts strongly or rather agreed with the way that their payments were handled. However, 23 % of experts disagreed that the number of days they were paid for their remote work matched the effort they actually spent on their tasks. The additional feedback to an open-ended survey question revealed that a significant share of the experts felt they had spent more time on their tasks than they were compensated for, and that their daily allowance was too low. Nearly 91 % of the experts would certainly, and another 8 % possibly, want to work with the Agency in the future.
The survey of EC officials:
The EC officials were very satisfied with the timeliness of REA’s evaluation process. A slightly lower level of satisfaction related to the quality and transparency of the evaluation process, with the main criticism concerning REA’s performance in ensuring the competence of the expert evaluators.

Task 2.3. Coherence

The survey of EC officials found a rather high overall level of satisfaction with the frequency and quality of the specific outputs produced.
Based on the Survey C results, the key policy feedback activities and outputs
produced by REA for the Commission included Reporting to Programme Committee
and/or Advisory Group (69 %), coordination meetings with Project Officer/Policy
Officer (67 %) and Inputs from REA projects for EC communication and
dissemination (64 %). Among the different response options, EC officials ranked
the frequency (84 %) and quality (87 %) of this policy feedback output most favourably. Less than 65 % of the respondents, however, indicated that they had used these policy outputs to inform their policymaking tasks. In terms of the
coordination meetings with Project Officer/Policy Officer, the levels of satisfaction
among EC officials with the frequency (73 %) and quality (76 %) of policy
feedback outputs were also high. Around 70 % of EC officials stated that they
had used these policy feedback outputs to inform their policymaking tasks.
Regarding input on REA projects for EC communication and dissemination, the
respondents were satisfied with the frequency (69 %) but they were slightly less
satisfied with the quality (61 %) of policy feedback output provided. Around 64 %
of respondents indicated that they had used this policy feedback output to inform
their policymaking tasks.
According to EC officials surveyed, Collecting and giving feedback about
researchers’ needs and satisfaction (22 %), Innovation Radar (33 %) and
Contributions from REA to the preparation of policy reports (38 %) were among
REA’s policy feedback outputs that Commission officials were least aware of, and claimed not to have received. Nevertheless, among the surveyed EC officials who stated that they had received these policy feedback outputs, there were high levels of satisfaction with their frequency and quality. Moreover, a significant share of these respondents stated that they had used these policy feedback outputs to inform their policymaking tasks. For instance, about 70 % of respondents were satisfied with the frequency of this policy feedback output and 66 % were positive about its quality. About 60 % of these respondents revealed that they had used this policy feedback output to inform their policymaking tasks. The results regarding the frequency and quality, as well as the utility, of the Innovation Radar and contributions from REA to the preparation of policy reports were similar.
The extent of take-up of policy outputs varied considerably between DGs and
between types of policy feedback outputs produced. The survey of EC officials
revealed that some policy feedback outputs produced by REA were rarely used,
even if their overall quality was satisfactory. For instance, about 53 % of EC
officials indicated that they had frequently received written feedback of high
quality from REA on WP implementation, yet only half of these respondents had
used this policy feedback output to inform their policymaking tasks. This
suggests that the Agency was in some cases using its resources to produce policy
outputs that were of limited value to the Commission.
The respondents of Survey C suggested several missing policy feedback outputs
as well as ways to improve the format of the existing ones. With regard to the
missing policy feedback outputs, some EC officials noted that they would benefit
from cluster analysis of portfolios, more information about the nature of linked
multi-disciplinary projects, feedback on the quality of project outputs, and
thematic analysis at a project level. In addition, responses to the open
questions in Survey C suggested some specific ways in which the provision of
policy feedback outputs could be strengthened, for instance:
 Adoption of two-way communication flows: more systematic briefings of POs in REA
on the ongoing policy discussions overall and within their parent DG (to facilitate
extraction of project-level information, research results and useful feedback);

 Provision of training and capacity-building support for REA staff (to help develop
their understanding of the thematic research and policy breadth of the entire
H2020 programme, not only the sub-programme they are working on);

 Allocation of a thematic portfolio to REA staff, to build a stronger relationship
with colleagues in the parent DG who are responsible for a particular policy area
(i.e. a policy mirroring approach). This in turn could help to strengthen the
quality of policy feedback. It may also have a positive motivational effect on REA
staff, since they could take greater ownership of particular themes. A stronger
thematic portfolio approach could also help to make linkages between projects that
contribute to the same theme.

 More systematic integration of the parent DGs’ policy feedback needs into the
policy feedback mechanism: policy officials from the parent DG should be
systematically informed about projects relevant to their policy area, including
timely information about the dates for kick-off/mid-term and review meetings.

 Improvements with regard to the access of parent DGs to the projects managed
by REA.

 Some parent DGs expressed the view that more direct contact with beneficiaries
would improve their knowledge about programme implementation and the results
being achieved through project-level research activities.

 Establishment of shared folders/common IT platform with project-related
information;

 Development of minimum mandatory requirements for policy-relevant information
from projects, such as the provision of 'Policy Rounds' or 'Policy Papers';

 Hiring of a professional ‘programme analyst’.

Some of the surveyed EC officials, however, shared concerns regarding the
practical implementation of these measures. These concerns related to the
specialisation of human resources, the lack of training, and the need for
capacity-building for Agency staff and EC officials.

Summary of the findings from the interview programme

The interview programme encompassed high-level and exploratory interviews with EC
and Agency officials, as well as in-depth study area-related interviews with REA’s
beneficiaries. In total, the programme involved 46 interviews:
 31 exploratory interviews carried out during the first and second interim
phases of the project; and
 15 interviews related to in-depth study areas conducted during the final phase of
the project.

The interview programme involved the Agency’s key beneficiaries. It was used
either to supplement other sources of evidence or to gain insider insights in
cases where other data sources were scarce. The in-depth study areas aimed to
explore the effectiveness, efficiency and coherence of REA’s activities over the
evaluation period. In addition, special focus was given to assessing the
Agency’s success stories and lessons learned during 2015-2018. Following the
field visits to Brussels in August/September 2018, our team collected data and
findings for all four in-depth study areas. The findings from the 15 interviews
with the Agency’s beneficiaries were integrated primarily into cases 3 and 4,
and each study area benefited from approximately 6-8 interviews. Please refer to
Annex 5, which presents the four in-depth study areas, and to Table 26 below for
a summary of the results of the interview programme.

Table 26. Summary of interview results.

Summary of the interview results

Task 2.1. Effectiveness

Interviews with the Agency and DG officials


Most of the interviewed Agency and DG officials reported that the Agency managed its tasks well,
and that the existing communication mechanisms and other collaboration tools ensured that the
collaboration did not involve any micro-management by the Agency’s parent DGs. Respondents also
confirmed that REA collaborated closely with the Commission by providing technical/specialised
contributions to the strategic planning for Horizon Europe, the drafting of Horizon Europe
indicators and the specific programmes for Horizon Europe. In addition, REA supported programme
design and implementation by collaborating closely with all other members of the R&I Family.
With regard to policy feedback, interviews with EC officials suggested that some specific needs of
the EC were not fully met, as officials would have preferred more customised policy feedback
outputs.
Some policy officials at the parent DGs thought that they were not systematically consulted on the
pool of experts, and that their suggestions were not always taken into account by REA. The
interviews with both the Agency and the Commission demonstrated that there were cases where, for
various reasons, the parent DGs could not contribute effectively to the process of expert
selection. Notable reasons included a lack of resources to analyse the lengthy lists of experts
shared by REA with the policy officers, and a lack of awareness of the procedures allowing them to
contribute to the process.
The interview programme demonstrated that some EC officials lacked awareness of the provisions of
the MoU which allowed them to participate in the briefings on policy-relevant aspects given to
experts, or to attend panel review meetings as observers.
Another issue that emerged during the interview programme related to the lack of feedback on the
progress of ongoing projects and on early results/lessons learned for the policy implementation
process. Although the collaboration between REA and its parent DGs was not based on clearly
formulated needs of the parent DGs, the Agency compiled and provided the parent DGs with
up-to-date information on main project events, including regular project and review meetings,
through shared calendars in accordance with the provisions of the MoU. While project events were
less relevant for policy officials working with the bottom-up programmes, such as MSCA, SEWP and
FET-Open, they were highly relevant for those who worked with the top-down programmes. Due to the
varying needs concerning the participation of the parent DGs in project-monitoring activities, REA
recognised a need to formulate specific business processes in this area at the unit/parent DG
level.
Throughout the interviews, EC officials in nearly all parent DGs suggested considering how certain
working arrangements could be modified to give REA and its parent DGs greater effectiveness and
flexibility in implementing the programmes delegated to the Agency. For instance, several EC
officials recognised an increasing demand for the Agency to assume the implementation of the
remaining actions of policy interest, to avoid the inefficiency of maintaining a full project
management capacity at the EC for a fairly small portfolio of projects. Both the Agency and
Commission officials recognised that this arrangement would require both sides to formalise a new
set of implementation modalities for these actions, allowing the Commission to remain closely
involved in the monitoring of such projects (due to their high significance to the Commission’s
policy objectives and reputation). At the same time, REA’s primary role in the administrative
implementation of these new activities would have to be maintained, along with the Agency’s
overall responsibility as an Authorising Officer.
In addition, the Agency staff revealed that they experienced some challenges in implementing a
cross-cutting call in the area of Blue Growth under Societal Challenge 2, because the budget for
this call had to be pooled together from different parts of the framework programme managed by
three different Executive Agencies, i.e. REA, EASME and INEA. It is expected, however, that the
additional degree of flexibility envisaged under the next EU Framework Programme for Research and
Innovation will contribute to the efficiency gains and economies of scale achieved by the Agency
in the implementation of cross-cutting calls.
Several interviewed EC officials (e.g. in DG GROW) suggested that REA could implement additional
types of funding instruments, such as scientific prizes or tenders, similar to those currently
implemented by some other Executive Agencies, e.g. EASME and CHAFEA. The interviewed REA officials
were generally receptive to this idea; they noted, however, that so far there has been no demand
for REA to implement these types of actions, and that a sufficient scale and number of the new
types of actions would need to be implemented to fully exploit the efficiency gains and economies
of scale that REA can provide.
Most of the interviewed EC officials recognised that the Agency was effective in identifying
potential and outstanding success stories to be showcased during events or published on websites
and in Commission magazines. However, some officials from one parent DG noted that, on a few
occasions, REA prioritised the administrative excellence of projects over their content when
identifying and selecting successful projects for communication purposes. Nevertheless, REA
continuously devoted efforts to improving its capacity to identify and select successful projects
for communication purposes.

Task 2.2. Efficiency

Interviews with the Agency and DG officials:


The interview programme with REA staff revealed that the Agency made constant efforts to improve
the quality of the evaluation process. For instance, one of the improvements introduced was a tool
allowing the pre-allocation to be checked and updated manually. REA also took specific measures to
automate and improve the detection of possible conflicts of interest for experts. Additionally,
the development of IT tools and electronic workflows contributed to an improved delivery of
programme management functions and a growing level of beneficiary satisfaction with most grant
management processes. Moreover, the introduction of remote evaluations brought significant
improvements to the Agency.
It was also specified during the interview programme that, with the adoption of various HR
management measures, including the workload management exercise, REA was able to cope well with a
much higher workload level than was foreseen in the 2013 CBA. As a result of the actions
undertaken, REA staff stated that no business continuity challenges occurred. However, interviews
with the Agency staff also revealed that other factors, such as differences in complexity and in
the level of interaction with stakeholders (e.g. implementation difficulties and conflicts with
beneficiaries associated with legacy actions), limited the capacity of certain project officers to
take advantage of the workload management exercise. It was, however, also noted that when
newcomers joined REA, the exercise allowed the Agency to readjust the number of staff more easily
based on the actual evolution of work.
Our interviews also suggested that the Agency’s HR planning and staff recruitment processes ran
smoothly. The feedback from the interviews with REA staff revealed that the recruitment of
scientific profiles with strong expertise in project management improved significantly after the
opening of the permanent call for expressions of interest via EPSO. The remaining challenges to HR
planning and recruitment were linked to the need to shorten the recruitment procedure, the limited
resources available for its implementation, and the technical issues which continuously affected
it. However, some interviews with the Agency staff during the interview programme in August 2018
raised questions about the extent to which the revisions made to the selection procedure for CAs
advanced the recruitment process. For instance, although the new selection procedures conducted
through a central market allowed CAs employed at REA to compete for higher function groups, some
of the Agency staff were concerned about the possibility of losing their positions at the Agency
to external candidates.
Interviews with the Agency’s beneficiaries:
More than half of the interviewed REA beneficiaries indicated that the proposal submission stage
was clear, especially in terms of how to write and submit the application. However, some
respondents recommended that the Agency simplify the proposal submission stage, for example by
introducing a proposal template, or by shortening and simplifying the guiding form for the
proposal. Survey A also revealed that a significant share of the beneficiaries did not know
whether REA could be contacted during the application phase; as a result, the majority of
beneficiaries relied on the services of National Contact Points during this phase.
Almost all interviewed beneficiaries agreed that the time REA took to decide on their application
was reasonable (in most cases no more than several months). About half of the interviewed REA
beneficiaries agreed that the feedback provided by the Agency on their proposals was useful,
although some respondents noted that the feedback could be more extensive.
More than half of the interviewed REA beneficiaries stated that, even though the contract
preparation phase was long and tense, it remained straightforward and successful. Nevertheless,
some interviewees encountered challenges during this phase, some of which related to
beneficiaries’ lack of awareness of the project officer’s responsibilities.
During the project implementation phase, respondents usually contacted project officers when they
needed specific information, for example regarding reporting, finances and grant amendments.
While some interviewees recognised that their project officers were quick to respond, more than
half of the interviewed beneficiaries stated that it usually took their project officers several
weeks to respond to their queries/requests. According to the interviewees, it was particularly
difficult to receive a prompt response from the Agency with regard to finance. Some of these cases
resulted in delays to the delivery dates for key deliverables/activities. Moreover, the responses
received were sometimes unclear or inaccurate.
More than half of the respondents were satisfied with REA’s efficiency; however, some interviewees
emphasised that REA needs to increase its capacity, i.e. the Agency needs to hire more experienced
staff, as project officers change often during a project. Some of REA’s beneficiaries noted that
the project officers experienced a high workload during the evaluation period, making them
difficult to reach. Furthermore, some beneficiaries noted that their project officer was not
interested in the content of their individual projects. Although some beneficiaries noted that
their project officers seemed interested in, and competent to deal with, the progress of their
project content, the officers’ main responsibilities lay with administrative/managerial tasks;
thus, they could not carefully evaluate/analyse the deliverables associated with the projects in
their portfolios. Beneficiaries reported that the Agency staff therefore had to rely on external
monitors to provide comments on the project deliverables.
Some beneficiaries claimed that the comments received from the monitors lacked coherence with the
projects’ foreseen activities and timelines. Others said that while the selected monitors may be
established academics in their fields, they were not policy experts who could advise on how to
effectively reach policymakers (especially where the goal of the project is impact and
beneficiaries expect to interact with policymakers). Several beneficiaries therefore noted that
they would find it helpful if REA had more staff and internal monitors who could evaluate their
projects, instead of relying on external monitors whose suggestions and criticism were sometimes
misplaced. Some interviewed beneficiaries also argued that if monitors were to remain, they should
not replace the Commission’s policy advisers. Key suggestions from these respondents related to
establishing measures to link project results to policy, possibly by arranging direct discussions
between beneficiaries and the Commission, at least at the end of a project, to assess how its
results can be used in policymaking tasks. Given that EU projects have the potential to make a
significant impact due to their large scope and budget, some beneficiaries thought the Commission
should also dedicate more effort to becoming involved in the project management process, so as to
be able to shape these projects for policy impact.
The majority of interviewees found their contract conditions clear. However, a couple of
respondents suggested that the contracts could be simplified and shortened (e.g. following the
approach used in FP7, where contracts were signed with separate annexes rather than long, complex
texts in the main part of the contract). According to some interviewed beneficiaries, REA could
also better establish roles and responsibilities in the grant agreement and have a protocol for
conflict resolution.
REA’s beneficiaries were generally satisfied with the Agency’s IT tools, except for small-scale
bugs which affected their use. The interviewed beneficiaries were in most cases able to resolve IT
issues effectively with support from REA’s IT helpdesk. Suggested areas for further improvement
included, for instance, the functionality of the Participant Portal. One suggestion was to add a
functionality allowing beneficiaries to transfer the proposal to the Portal automatically (in
particular all the information about milestones, deliverables and WPs). Some beneficiaries also
noted that they would benefit from a functionality allowing them to correct deliverables on the
Portal (especially for typos or small changes) without having to ask the project officer to reject
the deliverables and then resubmit them.
Even though the majority of respondents were satisfied with the reporting requirements, they had
some suggestions for improving them. It was suggested that reporting requirements should be more
output-based rather than process-based (e.g. time sheets, person-months), as the latter complicate
the processes and do not necessarily reflect the real impacts of the projects.
Most of the interviewees were successful in meeting the deadlines set by REA and, when needed,
received the necessary support and coordination from their project officers regarding these
deadlines. Most of the interviewees did not experience any issues regarding payments. In some
cases, however, payments were delayed by several months; for example, some financial delays
occurred due to the summer holiday season, or when the IT system could not handle updates to the
financial and scientific parts at the same time.

Task 2.3. Coherence

Interviews with the Agency and DG officials:


In the context of REA’s portfolio of activities, REA management confirmed through the interview
programme that, while REA’s programme portfolio is thematically diverse, it is overall coherent
and consistent. It was noted, however, that there are some specificities within the Marie
Skłodowska-Curie Actions (MSCA), corresponding to the different types of activities supported,
compared to the rest of H2020.
Another area of concern for the Agency officials related to the development of IT tools by the
CSC. The IT tools were generally viewed as too generic and insufficiently customised to meet REA’s
needs, which differed depending on the H2020 sub-programme, reflecting external pressures to meet
the needs and expectations of different parent DGs. There has consequently been a need to work
closely with the CSC and DG DIGIT to meet the needs of different DGs and to articulate the need
for the customisation of IT tools to the CSC. This was seen as having created a significant
additional workload for REA. More positively, however, there was also feedback that some corporate
IT tools have led to efficiency savings, especially through the transition to electronic reporting
in H2020.
A general observation by interviewees from REA was that the IT providers they work with are not
always close enough to the business processes REA is responsible for and do not understand the
specific needs underlying these processes. Notwithstanding this, the CSC pointed out that the
governance structure has been specifically designed to ensure the participation of all
implementing bodies in the decision-making process; this set-up permits the concerns of different
implementing bodies to be officially addressed. Nevertheless, interviewees from REA believe that
the parties could work together even more closely. Questions were also raised as to whether the
overall timings for the development of IT tools (e.g. OSIRIS) were appropriate, since some tools
will only be developed and fully implemented by the end of 2019, and a number of processes still
remain to be implemented in the COMPASS/Sygma grant management tools. A particular problem
identified was that some IT tools were down for several weeks, which had an impact on REA’s
productivity. In future, in order to overcome such difficulties, close collaboration between REA
and its IT providers will be needed.
Despite the stream of policy feedback provided by the Agency to the parent DGs, many interviewed
EC officials reported gaps in this area. This was also confirmed by interviewed EC officials from
multiple parent DGs, who recognised that, due to a lack of resources, they had not always been
able to analyse and utilise the policy outputs provided to them by REA.
Regarding the governance arrangements relating to policy feedback provision, stakeholders
interviewed acknowledged that the governance arrangements relating to REA are complex, reflecting
the fact that the Agency has six parent DGs. The parent DGs stated throughout the interviews that
they received adequate information about REA’s organisational performance in delivering on its
delegated programme management and implementation remit through the AAR, which provided the
primary reporting tool for communicating key achievements pertaining to REA’s operational
performance to parent DGs and wider stakeholders.
It was noted throughout the interview programme that REA has already begun to develop innovative
ways of identifying policy-relevant research results across groups of projects, in particular by
fostering a thematic clustering approach. Thematic cluster meetings have been designed as a means
of discussing research results across thematically linked projects so as to derive policy lessons.
Interview feedback and the results of Survey C suggested that this initiative was strongly
welcomed by EC officials. At the same time, because of their formal nature, some interviewees
expressed concerns that the cluster meetings could become an ‘artificial’ reporting mechanism
without a direct link to a strategy for more effective knowledge management. The challenge of
organising clusters sufficiently quickly was also raised. For instance, officials noted that if a
new priority area emerged, such cluster meetings, and the reports published following them and
subsequently presented to the most relevant policy DG(s), should be organised more speedily; it
was suggested that the Agency should organise cluster meetings within 2-3 months.
However, according to the interviewees from REA, the need to respond rapidly to requests made by
policy DGs posed human resource and technical capacity challenges. This suggested the need for
further capacity-building to strengthen REA staff’s knowledge and understanding of key policy
challenges. It also suggested a need for EC officials from different parent DGs to decide which
formats of policy outputs, over and above fact sheets, would be most useful. Some interviewees
from the Agency also noted that there have been problems with receiving the same questions through
different information channels. The interviews with the CSC revealed that REA has helped to
develop a common approach to exploitation and dissemination between the EC services and Executive
Agencies through its participation in the CSC’s Executive Committee. However, some parent DGs
reported the need to further improve both the communication channels and the nature and
appropriateness of the information shared.
Furthermore, to strengthen access to information, the Agency aims to ensure that EC officials in
the parent DGs have easy access to relevant databases, efficient use of a common database, and
access to better data mining using up-to-date tools (such as CORDA and OSIRIS). Some of these IT
tools, such as OSIRIS, were still being developed at the time of the interview programme in
August/September 2018; they are intended to enable the Commission services to better exploit the
relevant content of operational files and to maximise the use of data for better policymaking, in
line with the Commission strategy on data, information and knowledge management. As a result, not
all stakeholders expressed positive views on the current utility of this tool.
Regarding the extent to which the different responsibilities are adequately communicated to
beneficiaries, interviewees observed that the external world, including beneficiaries, is not
always aware of the different roles and functions of the Commission, REA and the various helplines
and contact points. In particular, it was observed that the Research Enquiry Service (RES) and
Commission contact points sometimes receive very similar requests for information from the public,
but through different channels. Since there are several entry points for some questions, some
interviewees suggested that this is not optimal from a beneficiary perspective, as there is no
single contact point where beneficiaries can obtain all the necessary information.
Interviews with REA’s Unit A staff revealed that the CSC could not develop a tool to schedule the
large number of meetings linked to organising remote evaluations and to accommodate the
rescheduling of virtual meetings. As a result, one of the POs had to do the scheduling by hand,
which was highly time-consuming. Another suggestion which would help the Agency deal with its
workload was the development of a video conferencing system allowing Agency staff to connect more
easily with experts working remotely.
Because remote experts were used to a greater extent than had been expected, REA has had to
undertake a lot of “learning by doing” in managing remote evaluations in order to gain experience.
However, interviewees mentioned that further training will be needed for REA staff on how to
manage the involvement of remote experts during consensus meetings. A need was also identified for
training videos for experts serving on remote panels.

Summary of the focus group results

A meeting/focus group was organised on 17 January 2019 in Brussels at the
Commission’s premises to present and validate the findings of the evaluation. Altogether,
19 stakeholders participated, including members of the Steering Committee, the Agency’s
managerial staff and representatives of most of its parent DGs, including DG RTD, DG EAC,
DG CNECT, DG AGRI and DG HOME. The main findings from the evaluation were presented
during the meeting/focus group. A summary of the results is provided in the table below.

Table 27. Summary of meeting/focus group results.

Recommendations Key points discussed during the meeting/focus group

Recommendation 1 Regarding draft recommendation 1, the contractor agreed to review and
rephrase the recommendation. It was agreed that the revised
recommendation 1 should better reflect the fact that REA has already
initiated a set of activities in several specific areas and business processes
mentioned in the recommendation text, and that the focus should now be on
the consolidation of these processes. Given that the Agency demonstrated
consistently high performance in terms of the KPIs, and the fact that all the
relevant data, including KPIs per programme, are available on CORDA, both
the Steering Group and REA agreed that no further action is needed in this
area. The contractor nevertheless also agreed to reflect further on the fact
that some internal communication issues at the Commission, and a general
lack of knowledge about the business processes involved, may explain a
certain level of dissatisfaction with the expert selection process and
procedures. Considering that draft recommendation 4 was dedicated to the
area of policy feedback, draft recommendation 1 should no longer list it as
one of the specific areas requiring further standardisation. It was also
agreed that the revised recommendation 1 should be primarily addressed to
the Commission, emphasising that the Agency’s parent DGs should
participate more proactively in REA’s initiative to formulate their specific
needs in relation to each of the specified areas (i.e. selection of
independent experts for evaluations, validation of expert lists, and
participation of EC officials in project-monitoring activities). Furthermore,
the contractor should reflect on the fact that EC officials from the parent
DGs should be better informed about their roles and responsibilities in
these processes, as set out in the MoU.

Recommendation 2 With regard to draft recommendation 2, it was agreed that the scope of
the recommendation should be limited to the implementation of potentially
policy-relevant actions. Any references to Horizon Europe should be
removed from the recommendation text, given that the new framework
programme had not been adopted at the time of the evaluation. The
contractor agreed instead to move the lessons learned concerning the
flexibility needed in implementing cross-cutting calls and additional types of
funding instruments to the conclusions. It was also agreed that the revised
recommendation 2 should be addressed to both the Agency and the
Commission.

Recommendation 3 Concerning draft recommendation 3 on the user-friendliness of the corporate
IT tools, the contractor agreed to reflect on the possible improvements that
could be introduced in this area in the conclusions text; however, no specific
recommendation should be made, as the issue is not directly related to
REA’s efficiency.

Recommendation 4 Regarding draft recommendation 4 on the establishment of specific policy
reporting mechanisms and business processes for policy feedback, the
recommendation was broadly accepted, and only minor changes were
needed. It was agreed that the revised recommendation should be primarily
addressed to the Commission and then to the Agency.

Annex 3: Interview Questionnaire

Interview questionnaires: interviews with EC and REA officials

1. Effectiveness

1.1. REA’s operation in accordance with its legal framework

During 2015-2018, the Commission’s new Decisions amending the mandate of REA demonstrated
the Commission’s commitment to achieving greater synergies and efficiencies in the management
of its programmes, with REA serving increasingly as a central provider of support services. To the
best of your knowledge, were there any problems related to this transitional period, in particular
in adjusting REA’s operations to the updated legal framework?

In your opinion, have there been any activities entrusted to REA which pose a risk of crossing the
line between programme implementation (REA) and policy development (EC)? Are you aware of
any potential cases of micro-management of activities by the Commission?

In light of the Agency’s mandate extensions in 2015 and 2017, has REA carried out these tasks to
the fullest extent possible (i.e. in a manner fully coherent with the legal framework) or would you say
that activities in some of these areas could be given more attention? Did REA manage to absorb
the extended tasks and workload without loss of quality and effectiveness?

The Agency currently works under the rules established in the 2016 MoU. Is the MoU fully
operational and effective? Is it in need of updates?

Since its creation, REA has undergone continuous fine-tuning of its internal procedures and
organisational structures, streamlining its processes, integrating its activities into the IT solutions,
and updating documents for H2020 as a whole.
 In your opinion, was the operation of REA flexible enough to accommodate the key
changes, at the same time maintaining concordance with the legal framework establishing
the Agency?
 Are you aware of any challenges related to this transition faced by the Agency?
 What are the main gains from these changes?

1.2. Extent to which REA has achieved its objectives

Compared to the previous years, the 2015-2018 period was a period of relative stability for REA.
What were the key changes that the Agency went through during 2015-2018?

Generally, how would you assess the overall operation of REA in terms of its:
- Strengths (e.g. generally strong KPI results, particularly in the execution of commitment and
payment appropriations, TTI, TTG and TTP; the evaluation process is widely recognised by
independent observers and evaluators, etc.);
- Weaknesses (e.g. staff turnover and its effects on the quality of the services, lack of
sufficient and coherent instructions on the management of conflicts of interest);
- Opportunities (e.g. further implement SEDIA with the support of corporate business and IT
solutions by which the process for handling information for procurements and grants would
be fully automated and integrated, consolidate REA’s practices to provide policy feedback by
further tailoring them to the specific needs of the various EC policy services to facilitate a
more systematic contribution to the Commission's policy developments, etc.);
- Threats (e.g. the continuous challenges faced in the legacy SME instruments)?

When it comes to performance management of REA (defining and estimating KPIs), is this process
effective? Do you see any challenges/problems in terms of defining KPIs and estimating them?
What could be improved in this regard?

Overall, would you assess the implementation of the delegated programmes FP7 and Horizon 2020
as effective? Were there any lessons learned (especially from the closing down of the FP7 project
portfolio and the consolidation of H2020 project management covering all stages of the project life-
cycle)?

Is REA equally effective in all stages of proposal evaluation/project management (call/proposal
stage; contract negotiations; implementation of the projects/project management; follow-up and
audits)? In your opinion, which of the stages mentioned above could be further improved and
how?

1.3. Extent to which REA led to an improved management of the programme(s),
including in terms of simplification measures, proximity to addressees and
visibility of the EU

Examples of alternatives to the current Executive Agency management model include (1)
management of the programmes by the Commission; (2) mixed Agency–Commission
management; (3) partial management by the Commission while outsourcing some activities to the
extent legally possible. In your opinion, does the current model ensure the best management of
the programme and highest quality services to the stakeholders? Would you say that other
alternatives could improve the management of the programmes? In which ways?

Please share your opinion on whether the current model is optimal in terms of (1) addressing
resource constraints (esp. during periods of high workload); (2) capacity to provide specific skills.

REA has an objective to be as responsive as possible to the beneficiaries’ needs, both
quantitatively (e.g. in terms of TTG and TTP) and qualitatively (in terms of the beneficiaries’
satisfaction with the services provided and overall involvement of the scientific community in the
management of the programmes). In your opinion, is the current extent to which resources and
procedures are delegated to maintain contact with beneficiaries appropriate? What could be
changed?

REA is expected to act in compliance with the Commission’s guidelines on information and visibility
of the programmes, as well as the instruments put in place to ensure the visibility of the
Commission as promoter of the programmes entrusted to REA. To the best of your knowledge, are
the appropriate funding/instruments in place to ensure the visibility of the Commission as a
promoter of the programme?

In your opinion, is the REA brand well known among the beneficiaries? What would you say are
the key perceptions about REA among beneficiaries and stakeholders? Would you say that these
perceptions have an impact on the effectiveness of the Agency? What kind of impact?

1.4. REA’s external communication

Balancing external communication: the previous evaluation of REA, finalised in 2016,
recommended that REA better inform the research community about the opportunity to work in the
capacity of an independent expert, as well as about overall opportunities to participate in EU-funded
research projects. How effective were the new dissemination campaign developed via the
Participant Portal and the video prepared by REA to encourage experts to register?

Reports suggest that during 2015-2018 REA launched a number of information campaigns aimed at
the beneficiaries and the wider society. Those activities included:
- Various conferences, workshops and events promoting the programmes/calls and
disseminating the research results;
- Information campaigns highlighting key management and control issues, such as the mass
information campaign highlighting actions to correct the errors in the SME instrument;
- News alerts sent to stakeholders interested in REA’s activities.

Which of these activities proved to be the most effective and most valued by the beneficiaries? To
what extent could the beneficiaries influence the programme/set the agenda for these
communication events?

2. Efficiency

2.1. Detailed deconstruction of key steps of the call/application/project management
life-cycle

To be discussed – operational efficiency and challenges faced during various programme stages:

 Preparation stage: consultations and cooperation with the Commission in the preparation of
work programmes, managing call publications, budgeting;

 Application stage: information for applicants; reception of proposals; admissibility and
eligibility check; evaluation; respective indicators (time-to-inform, redress, budget
execution/success rates, etc.);
 Conclusion of grant agreements: validation of beneficiaries; financial capacity and cross-checks;
grant finalisation and contracting; respective indicators (TTG);
 Follow-up and monitoring: pre-financing; interim and final payments; verification of the
eligibility of costs declared by the beneficiaries; reporting; amendment of grant
agreements; respective indicators (TTP, TTA);
 Ex post controls and recoveries: audit strategies and plans; ex post audits;
corrections/recoveries.

Horizontal questions:

 Administrative novelties and simplifications in the 2014-2020 programmes, especially those
introduced during 2015-2018; impact and further scope for simplifications; rules and
procedures throughout the programme/project management cycle (clarity, stability,
suitability, efficiency, etc.).
 Identification and discussion on key workload drivers pertaining to various stages of
programme/project life-cycle.
 What aspects/means/processes render REA more or less efficient? What could be
improved?
 To what extent have REA’s internal organisation and procedures been conducive to its
efficiency?

IT systems employed for application and grant management:

 The beneficiaries’ survey carried out during the previous evaluation of REA in 2015 showed
that satisfaction with the user-friendliness of the IT tools employed for application,
contracting and grant management among beneficiaries was quite low. In your opinion,
has the user-friendliness and functionality of the IT tools improved during 2015-2018,
and if so, how?
 How do the functionality and user-friendliness of the IT systems satisfy the internal needs of
REA during the various management stages of the programmes (receipt and evaluation of
applications, contracting, grant management and ex post controls)?

2.2. KPIs and overall efficiency of REA’s activities

General discussion on the achievement of REA’s key performance indicators and the progress
made in 2015-2017. Some of the more specific issues to be addressed during interviews include:

- Budget execution of commitment and payment appropriations. Reports show that
the operational commitment and payment appropriations from the EU general budget
were fully executed.

- Time-to-grant (TTG). TTG figures show an overall improvement over 2015-2018
compared to 2012-2015, with H2020 calls significantly outperforming the 2012-2013 FP7
programmes. In 2017, REA managed to sign all grant agreements in less than 245 days
(up from 99 % of grant agreements in 2016). Over time REA has steadily improved its
TTG performance: the average TTG for H2020 during 2014-2017 fell from 222 to 193
days. Which factors influenced such results (e.g. simplifications, improved procedures,
etc.)?

- Redress cases. The target for the Agency is a maximum of 0.5 % of redress cases
upheld. Actual data show that this target was achieved for the 2014-2016 calls (although
for the 2015 calls the share of redress cases upheld reached 0.46 %). What has been the
trend in the number of
complaints received? Is there an increasing trend due to the growing number of
proposals/lower success rates in some actions?

- Error rate. The target ERR for FP7 was set at 2 % (corresponding to the materiality
threshold) and for H2020 at between 2 and 5 %, with the exception of MSCA, where the
target was set at 2 %. The data show that the target residual error rate for FP7 was met
only for MCA (ERR of 1.55 % in 2017), whereas for Space and Security it stood above
3 % (estimated under the CRaS sample) and for SME actions at 5.8 %. The first results
of the H2020 audits show that the estimated H2020 residual error rate stands at 2.24 %
(based on the Common Representative Sample), and for the MSCA the error rate is
expected to remain below 2 % (based on the historical results of FP7 and the simplified
H2020 rules).

o What are the inherent risks in the above-mentioned programmes (e.g. grants
based on actual costs and corresponding reimbursement mechanism, complexity
of the eligibility rules, low level of control, subcontracting in SME instrument,
fraudulent behaviour)?
o What are the most recurring errors in the FP7 and H2020 programmes (e.g. wrong
calculation of hourly rates, absence of time recording systems, wrong calculation
of overhead rates, etc.)?
o The error rate for FP7 actions generally increased since 2014 – was this the result
of more intensive ex post controls introduced in later years of FP7
implementation?
o Which corrective actions were introduced in response to identified
weaknesses/high error rate?
o As of 2015 REA introduced additional ex ante controls for SME actions (requesting
SMEs to provide evidence that RTD performers’ invoices have been issued,
registered and paid) to make sure that SME beneficiaries comply with the rules –
what have been the experiences to date?

- Time-to-pay (TTP). During 2017, 97 % of project-related payments were made on time,
and the average TTP stood well below the set targets for all types of payments. What
factors drove this result?

- Average grant size & number of projects per officer. The AARs suggest that for some
of the newly delegated actions, the significantly lower grant size was only partly
counterbalanced by a slightly reduced delegated budget. What measures have been/will
be taken to address this situation, and how would you assess the adequacy of the
workload and the allocated human resources?

- Achievement of projects’ objectives. The KPI indicating the share of closed projects
that achieved all or most of their objectives reached about 96 % in 2015-2017. The target
set by REA is 90 %.

- Remote evaluations. In 2013 REA started using remote evaluations, thus considerably
reducing the travel and accommodation costs of experts. In 2016 REA introduced fully
remote evaluation procedures for some calls with high numbers of proposals, using
central evaluations in Brussels only for the most difficult and complex cases and for
panel meetings. What are the key risks associated with this move and how has the
Agency mitigated them?

2.3. Key simplification measures introduced during 2015-2018

We understand simplification as the introduction of new and better management and
implementation arrangements: simplification, streamlining and harmonisation of funding rules and
procedures within and across different programmes and programme strands; wider use of systems
for electronic data management and electronic data exchange between the administration and
beneficiaries, as well as user-friendliness of the employed IT systems; simpler forms of grants
(such as use of standard cost options (SCOs) – lump sums, standard scales of unit costs, flat-rate
financing), and other forms of simplification.

Examples of simplifications introduced in programmes managed by REA under Horizon 2020
during 2015-2018 include:
- Increased use of electronic workflow (paperless processing of grants);
- Increased use of (fully) remote evaluations;
- Improved and automated procedures for allocating proposals to the most suitable persons
(automated pre-allocation of proposals to experts based on the best match of keywords
related to scientific field of proposals);
- Simplified H2020 rules with regard to payments and eligibility/financing conditions (e.g.: a
single funding rate for all beneficiaries and all activities in the same grant; indirect costs
covered by a single flat-rate of 25 % applied to the direct costs; no time-sheets for
personnel working full time on a project; a wider acceptance of average personnel costs;
wider use of SCOs in MSCA, etc.);
- Other simplification measures introduced in H2020 (e.g. limitation of the mandatory ex
ante financial capacity check to private coordinators only and in case of funding in excess
of EUR 500,000; “No negotiation” approach);
- Streamlined contracting and payment of experts;
- Simplified expert contracts;
- Streamlined validation of legal entities, etc.
Could you please specify the main simplification measures and administrative novelties
introduced in the programmes managed by REA (your unit/department) during 2015-2018?

To what extent did these simplifications affect REA’s performance and KPIs? To what extent did
these simplifications affect REA’s workload and its ability to monitor and control the projects, and
what is the foreseen impact on error rates?

Client feedback and satisfaction: are there any indications which imply increased satisfaction among
the beneficiaries? Did these simplification measures produce any unintended effects?

Future scope for simplifications (e.g. further streamlining of management procedures; wider
use of simplified cost options (SCOs); further digitalisation and automation of the application/grant
management process, etc.). Please share your views on what types of new simplification measures
(and in which areas) could be introduced, focusing on the measures that, in your opinion,
could potentially contribute to reducing the administrative burden on both the beneficiaries and
the Agency and that could enhance the capacity to adapt to periods of high workload.

2.4. Management and provision of central support services

Under Horizon 2020 REA’s mandate was further extended to provide administrative and logistical
support services to all entities involved in Horizon 2020 management; it was also expanded to
certain programmes for health/consumer protection, competitiveness and innovation, as well as
education, culture and citizenship. Overall, would you assess the implementation of the support
services as effective? Are there any challenges to the Agency’s capacity to cope with the
significant increase in support activities?

There has been a significant increase in the support activities provided by REA and the 2018
Annual Work Programme suggests that the Agency will take on some additional tasks in the
future. Some notable examples of REA’s support activities include:
 Processing of the validation of new participants and the integration of procurement
activities as REA assumes responsibility for the legal validation of third parties and their
financial assessment (for all grants and procurements under direct management).
Additionally, REA is responsible for validation of the legal information relating to third
parties in all grant and procurement procedures implemented by Commission services,
Executive Agencies and Joint Undertakings under direct management123.
 Further implementation of SEDIA with the support of corporate business and IT solutions.
 Assessment of the financial capacity of beneficiaries of research grants. In 2018, the scope
of the financial assessment will be enlarged to include participants in operations managed
by the new SEDIA clients in line with the legal provisions and the specific requirements of
their programmes.
 In collaboration with DG BUDG, REA seeks to design and harmonise the rules for the
validation of third parties across the Commission bodies. Moreover, REA contributes to the
development of the corporate IT tools that will support an effective implementation of the
validation services for all beneficiaries and Commission bodies.
 The Agency operates the Research Enquiry Service addressing questions raised by the
public on Horizon 2020 and other research matters and feeding the Frequently Asked
Questions (FAQ) database.

How has the delegation of the additional services affected REA’s effectiveness, efficiency and
workload levels? Do you see the transfer of the additional services as optimal for the Agency?

Within the SEDIA framework, REA is responsible for the provision of certain administrative and
logistical support services to applicants for both grants and procurement activities, including the
first level of indirect management transactions, for all Union programmes. The following
administrative and logistical support services are provided for any programme which is part of the SEDIA
framework:
- Validation of legal entities, including natural persons participating in grants and public
procurement procedures;
- Preparation of legal entities’ financial viability assessment.

123 REA 2018 Annual Work Programme, p. 30.
What have been the main challenges associated with the execution of the above-mentioned
services? What corrective measures were taken to address them? What measures are taken to
ensure the quality of external experts?

Simplifications introduced: through a review of the AARs, AWPs and other reports we identified a
number of simplifications, including:
- The replacement of paper (blue-ink signed) documents by their electronic (scanned)
counterparts;
- The removal of the obligation to provide certified official translations of documents in non-
EU languages;
- The identification of default lists of roles/jobs/functions that could be recognised as legal
representatives without requiring further proof documents;
- Harmonisation of the different financial assessment methodologies, which allows the use of
a harmonised set of core financial data gathered from the supporting documents provided
by the participants only once;
- The use of the upgraded SME tool, which helps entities to determine whether or not they
qualify as an SME.

Have there been any other simplification measures introduced? What effect did these
simplifications have on the effectiveness and efficiency of REA’s central support services? Have
there been any plans to introduce additional simplification measures?

Key operational risks: the AARs and other reports mention some risks related to the support
services as REA assumed responsibility for the legal validation of third parties and their financial
assessment (for all grants and procurements under direct management), such as the delayed
processing of the validation of new participants and the integration of procurement activities. What
corrective measures were adopted to mitigate these risks, and were those measures successful?

2.5. Administrative budget and expenditures

General discussion on the evolution of the administrative budget and its main budget titles during
2015-2018:
- Title I. Staff expenditure comprises the following cost items:
o Remuneration, Allowances and Charges;
o Professional Development and Social expenditure, etc.
- Title II. Infrastructure and operating expenditure includes:
o Building expenditure;
o ICT expenditure;
o Movable property and Current operating expenditure, etc.
- Title III. Programme support expenditure:
o External meetings and information days, evaluation and review, missions and
related expenditure, programme management specific IT systems, communication
and publications, etc.
According to the AARs, REA's administrative budget, as a share of the operational budget, was
2.6 % during 2016-2017 (excluding provision of central logistical and support services) and was
below the figures for the previous years. What components drove this development to the largest
extent?
What further developments are expected in the Agency’s operational costs, and what are their
drivers? Have the necessary provisions been sufficient and made on time?
Staff costs (discussion on their evolution for the different types of staff): according to Article 18 of
Regulation (EC) No 58/2003, the staff of the Executive Agencies consists of:
- Officials seconded in the interests of the service and engaged by the Agency as temporary
staff within the meaning of Article 2(a) of the Conditions of Employment of Other Servants
(CEOS) (AT2a) in positions of responsibility.
- Temporary staff within the meaning of Article 2(f) of the CEOS (AT2f). Officials seconded
at their own request by an Institution are engaged as AT2f.
- Contract staff within the meaning of Article 3a of the CEOS (AT3a). The number of
contract staff may not exceed 75 % of the total number of staff employed in an Agency.
- The Agency may use seconded national experts (SNEs) and other types of external staff
under private-law contracts.

Has REA faced any difficulties/challenges in maintaining appropriate staff levels within the
categories mentioned above? If so, what are the reasons for this?

2.6. 2013 CBA and SFS

Preliminary analysis shows that, while the actual operational budget allocated to REA was lower
than estimated in the 2013 CBA/SFS during 2014-2015, it reached the initially estimated values during
2016-2018. However, there was a significant variation across the programmes – while for some
programmes (FET Open, MSCA, SEWP) the actual operational budget was higher than initially
estimated, for other programmes (Societal Challenges, Space, SwafS) it was lower. Also, for some
programmes the annual budgets significantly fluctuated during 2014-2018 and the average grant
size deviated from initial CBA estimations in some programmes (e.g. in Space and SC7 it was
much lower, while in FET Open and MSCA RISE it was higher than initially estimated).

Thus, the actual workload of REA deviated from the initial CBA/SFS estimates to a significant
degree.
- How do you monitor workload related to management of different programmes and
implementation stages (programming, evaluation and selection; conclusion of grant
agreements; project monitoring and payments; ex post controls and recoveries)?
- Do you have any special workload/productivity assessment/monitoring exercises/indicators
(a special workload assessment model was prepared for the 2014-2020 MFF and used for
the 2013 CBA; is it being further used/updated)?
- How do you adjust resources to the management needs (and horizontal tasks) of the
various programmes?

2.7. Internal control system & audits

Overview of REA’s internal and external control systems, particularly:


- REA’s Internal Audit Capability (IAC);
- The Commission’s Internal Audit Service (IAS);
- The European Court of Auditors (ECA).

Discussion on key mechanisms designed to monitor the functioning of the internal control
systems:
- Bi-annual reports submitted by the Heads of Unit in their capacity as Authorising Officers by
Sub-Delegation (AOSD);
- Opinion of REA's Internal Control Coordinator (ICC) on the state of control;
- Outcomes of activities of the ex post audit function and fraud prevention measures;
- Independent opinion of REA's Internal Audit Capability (IAC) on the state of internal
controls;
- Observations and recommendations reported by the Internal Audit Service (IAS);
- Observations and recommendations reported by the European Court of Auditors (ECA);
- Observations and recommendations reported by DG BUDG (in the context of the validation
of the local accounting systems by the Commission's Accounting officer).

Multiple reports suggest that the Agency’s management has reasonable assurance and that,
overall, suitable controls are in place and working as intended. The risks are being appropriately
monitored and mitigated, and necessary improvements and reinforcements are being
implemented. However, the Director, in his capacity as Authorising Officer (AO) for the
administrative budget and Authorising Officer by Delegation (AOD) for the operational budget,
expressed reservations concerning the FP7 Space and Security themes and the FP7 SME actions in
2016124.
- To what extent and how have these risks been mitigated (e.g. through additional ex ante
and ex post controls)?
- What have been the key intrinsic risks in the delegated actions?

The audit of H2020 Grant Management in REA concluded125 that, although REA has set up an
efficient internal control system for the grant management steps from the preparation of the calls
and the evaluation of the proposals to the signature of the grant agreements, and for providing
administrative logistical services to other entities implementing H2020, there is still a significant
weakness in its effective implementation. In this context, the IAS identified an issue in the
management of conflicts of interest:
- Lack of conflicts of interest (CoI) internal guidelines;
- Late detection of CoI;

124 The REA 2016 Annual Activity Report, p. 12.
125 Final Audit Report on H2020 Grant Management in the Research Executive Agency: part a) from the
preparation of the calls for proposals to the signature of the grant agreements, part b) administrative logistical
services provided for H2020, 13/05/2016, pp. 4-5.
- Breach of the data protection and confidentiality rules by an evaluation expert, or no timely
response to these types of incidents.

Has the Agency introduced any measures to provide clear procedures and guidance on the roles,
responsibilities and the coordination between the operational and the contracting units regarding
actions to be taken and procedures to be followed in case of breaches of confidentiality rules and
unauthorised processing of personal data?

Since 2015, REA has carried out an annual ex post verification on a number of entities to confirm
their SME status. In addition, an ex ante validation is carried out by REA, if requested by the
concerned entities participating in the SME Instrument, where SME status is an eligibility criterion
for participation. REA’s 2018 Annual Work Programme suggests that a new approach will be
defined for the ex post controls on SME status. Why are the additional controls being introduced?

One of the reports states that the target of a maximum residual error rate of 2 % cannot be
reasonably met without a massive increase in the number of audits or in the administrative
burden imposed on participants through widespread ex ante controls.
- Are there any alternative approaches that could improve the residual error rate without
such extensive controls (e.g. more extensive use of flat rates, unit costs and lump sums)?
- Would additional automated tools aimed at detecting errors/fraud help reduce the
Agency’s exposure to risk?

2.8. IT tools

During 2015-2018 REA remained active in the governance of the common IT tools used for the
support services, in particular PDM/Beneficiary Register and EMPP/EMI for which it has a leading
role.
- How would you assess the appropriateness/functionality of the corporate IT tools at this
point?
- How would you assess the availability/appropriateness/functionality of the common H2020
dedicated IT tools?

Staff training: the AARs and other documents imply that REA staff have been intensively trained
on H2020 procedures and tools, FP7 procedures and tools (notably with support from PROMIS).
Has the Agency collected feedback on these training activities and monitored the progress?

The IAS follow-up of the audit on H2020 grant management in REA issued a recommendation to the
Agency to ensure that communication between REA and the beneficiaries is stored
systematically in COMPASS. In parallel, the CSC was asked to improve COMPASS/Sygma to
facilitate the project officers’ user experience when using these systems as a communication
channel. Have any measures been taken by the Agency in response to these recommendations?
With what results?

Future plans and developments with IT (discussion on the effectiveness/efficiency of the tools,
possible future actions):
- REA is currently testing a new IT system (ARIS) which aims to provide automatic and
streamlined detection, whereby text-mining is applied to the list of participants in submitted
proposals and to the pool of experts linked to a certain call;
- Other IT systems?

2.9. HR management

According to the 2016 Commission staff survey, the REA staff engagement index (67 points) was
similar to the EA average. The Agency scored well in terms of overall job satisfaction (76 points).
What main strengths and weaknesses has the Agency identified and prioritised since 2016?
Overall, how does the Agency follow up on the findings of the latest staff survey (2016)?

The previous evaluation of REA found that staff members of the Agency were generally satisfied
with the content of their jobs in the Agency. However, they were much more negative about the
career opportunities and professional future within the Agency. Therefore, it was recommended to
raise the issue of staff motivation and career opportunities with the European Commission and
encourage internal mobility as well as other options of professional development. How did the
Agency address this recommendation?

How are provisions on overtime and teleworking implemented in the Agency?

The performance audit on Management of HR in REA was carried out by the IAS in 2015. Among
its conclusions, the IAS audit identified several strengths in REA’s HR management. However, the
audit also concluded that significant improvements are needed in relation to the management
of the selection process for contractual agents. What measures did the Agency undertake in this
area? Have these measures helped to address the issues that were previously interfering with the
management of the selection process for contractual agents?

In the context of the efforts to enhance staff mobility, to what extent have the following measures
been effective?
- General Implementing Provisions governing the conditions of employment for contract
agents (adopted in October 2017) to enhance the mobility of contract agents between the
Executive Agencies and the Commission services;
- The implementation of the inter-Agency mobility for Temporary Staff 2f in coordination
with the other Executive Agencies.

The inter-Agency labour market became operational during the evaluation period. What were
REA’s experiences with this new opportunity? What were the main benefits and risks of opening
the inter-Agency labour market?

Planning of human resources: The Agency’s work programme must include a clear presentation of
the operating budget of the Agency, as well as the staff figures with an indicative breakdown per
programme and, within each programme, per activity and with an indicative breakdown for the
different types of agents. Based on your experience, have these plans accurately predicted the
actual demand for human resources in the different programmes and activities? How would you
assess the adequacy of staffing levels relative to the workload, and the possibilities to adapt to a
changing workload?

High staff turnover is a well-documented structural weakness of the Executive Agencies which
negatively affects their performance and results in higher programme continuity costs, opportunity
costs of staff replacement and general loss of know-how in the organisation.
- Staff turnover in 2015-2018 (Staff vacancy in 2015-2018); to what extent and in which
areas has staff turnover affected the Agency’s performance?
- Degree of difficulty to find the right profiles (time spent to recruit specific profiles and
number of candidates for each job advertised);
- What are the main reasons for staff turnover (e.g. relatively low salary level of CAs,
other)?
- To this end, how can HR policy help mitigate these risks (e.g. specific policies whose goal
is to retain the Agency’s top performers)?

Discussion on the main HR management activities and policies implemented during 2015-2018:
- Maintaining its specific objective regarding the time to fill vacant posts (5 months between
the moment a post becomes vacant and the effective recruitment);
- Enhancing the staff engagement index (by focusing on the Competency Framework, the
Learning and Development Framework and internal mobility to better match the
competencies to the positions available);
- Further improving the well-being of staff (tracking the results of the European Commission
Staff Surveys, introducing measures to promote a better work/life balance and a healthy
work environment, adopting the Learning and Development Strategy in 2018 with a
special emphasis on well-being);
- Coordination and consolidation of the recruitment process (a selection for temporary staff
was launched in early 2018 to have a reserve list available at the end of the year);
- Promoting gender equality (ensuring gender balance in middle management positions as
included in the standard reporting of REA to the parent DGs);
- Supporting the activities of the REA Staff Committee and moderating the social dialogue
between the Committee and REA management (with a particular focus on awareness of
psycho-social and physical well-being and on managerial peer learning and sharing of best
practice);
- Analysing job profiles in the context of the phasing out of FP7 projects and of the design of
the new financial rules planned for FP9;
- Working on the results of the 2016 European Commission Staff Satisfaction Survey and
further implementing the action plan for REA resulting from the 2014 Survey.

3. Knowledge management, coherence and added value of REA

3.1. Overall cooperation between REA and its parent DGs

The parent DGs are responsible for the preparation of the bi-annual Work Programmes. How does
REA contribute to the preparatory work on the work programmes regarding implementation
aspects (incl. launch of calls, dissemination, use and communication of results)?
Where else does it contribute? Overall, would you assess this process as effective and smooth?
Were there any lessons learned during 2015-2018?

There were substantial delays to the introduction of the 2014 AWPs due to the late adoption of
H2020. Were there any similar delays during 2015-2018?

The parent DGs define a supervision strategy aimed at avoiding gaps or duplication of efforts
resulting from crossover between their monitoring and supervision tasks and the execution tasks
of the Agency. The main provisions are established in the MoU.
- Overall, does clear delimitation of responsibilities and tasks between REA and the parent
DGs exist?
- Are there any examples of formal and informal activities undertaken by REA that could be
classified as policymaking?
- Are there any examples of the parent DGs potentially undertaking programme
implementation tasks otherwise entrusted to REA?

According to the EC guidelines for the establishment and operation of Executive Agencies financed
from the Union budget, the organisation charts of the Agencies are reflected in the Commission's
organisation charts as follows: the structure of the Agencies' organisation charts is reproduced in
full or in part in the organisation charts of the parent DGs (‘mirror’ chart). What are the practices
applied in different programmes/parent DGs? Are the current arrangements optimal for both REA
and the Commission?

3.2. Extent to which REA’s communication function supports the mission of the
Agency and ensures an effective feedback loop with the parent DGs and other
stakeholders

Discussion on the main communication/policy feedback mechanisms by programme/parent DG/unit:
- Key types of mechanisms used;
- Role of the seconded staff in the knowledge exchange;
- Frequency of communication and knowledge exchange between REA and its parent DGs;
- Overall satisfaction with the process (quality, clarity, relevance), further possibilities for
improvement;
- Take-up of the evidence and knowledge created by REA.

The previous evaluation recommended that the Commission and REA consolidate the structured
dialogue between the Agency and its parent DGs to improve the feedback of project-related
information into policymaking in line with the MoU. As a result, the Agency has taken a number of
measures to ensure good cooperation between REA and the Commission, focusing on 1)
cooperation within REA (e.g. establishment of a Task Force on policy feedback), 2) cooperation
with parent DGs (e.g. development of a multi-faceted cooperation and policy feedback mechanism
at the level of delegated programme and establishment of a Dissemination, Exploitation and
Communication Group), 3) cooperation with the CSC (e.g. coordination and streamlining of
processes across the Research family). How effective were the new measures in improving
information flow between REA and its parent DGs? What changed and what is still insufficiently
addressed?

In terms of reviewing the progress of the actions’ implementation, REA is currently exploring
the possibility of moving from individual project review/check meetings towards clustered
monitoring meetings organised in Brussels for selected topics. What are the expected gains from
this possible action? What are the potential costs/risks?

Based on the AARs, REA currently supports the communication of the parent DGs by providing
examples of successful projects and their impact (i.e. success stories, organisation of events). The
input provided by REA is used to support the parent DGs' communication activities, such as in the
"EU Empowers" campaign of DG COMM, the MSCA campaign of DG EAC, and the thematic months
of DG RTD.
- How effectively are REA’s programme units using the information from the managed
projects (grants and tenders) to provide analyses, lessons learned and project results
useful for monitoring purposes, informing policymaking and future project implementation
to its Parent DGs during 2015-2018?
- What would be the advantages and disadvantages of the change in the way projects are
monitored? In your view, are there other ways to gather and communicate policy feedback
in a more effective way?
3.3. Coherence and added value of REA’s activities

External coherence of the Agency: to what extent have there been overlaps/gaps/inconsistencies/
complementarities within the programme portfolio and support services managed by REA?

REA serves a twofold mission in that it (i) implements R&I programmes and tasks supporting EU
funding for R&I; and (ii) provides administrative and logistical support services for participants in
the Framework Programmes and other Commission services. Do you see this arrangement as
optimal, given that the portfolio of central administrative and logistical services is likely to further
grow in Horizon Europe?

What are the advantages – and possible drawbacks – of combining the different areas of
responsibility in the Agency’s remit?

Internal coherence of the Agency: from an organisational point of view, how coherently does the
Agency deal with the different aspects of its remit? Are appropriate human resources allocated to
the implementation tasks? How are work cyclicality/peak workload levels dealt with?

Is REA provided with sufficient resources to carry out its horizontal/administrative tasks?

To what extent would the closing down of REA result in losing significant know-how in relation to
the management of the programmes entrusted to REA? Could you provide concrete examples of
the Agency’s added value (or of the losses in case of its closure)?

3.4. Lessons learned and planned future actions

The previous evaluation report made the following recommendations:


- Continue the cooperation between the Agency and the parent DGs while planning the call
calendars in order to reduce the cyclicality of workload, particularly in the programmes
newly delegated to REA and in programmes with multiple parent DGs and/or units in the
Commission. A more explicit coordination mechanism should be adopted for this activity.
- Continue improving the external communication of the Agency, specifically:
o Better inform the research community about the opportunity to work in the
capacity of an independent expert as well as overall opportunities to participate in
EU-funded research projects;
o In areas related to REA’s provision of central support services to grants not
managed by the Agency, clear communication strategies should be developed
together with other Executive Agencies in order to ensure that the beneficiaries
are aware in advance of REA’s role in the procedures that await them;
o Improve the visibility of the Research Enquiry Service and increase the
stakeholders’ awareness of it in the new Horizon 2020 Helpdesk setting.
- Continue improving the functionality and user-friendliness of the IT systems used in the
project life-cycle management processes in order to reduce administrative burden,
improve productivity and enhance the overall user-experience for applicants and
beneficiaries.
- The Commission should explore ways of how the career and professional development
prospects of the Executive Agencies’ staff could be improved and revise the corresponding
provisions of the staff regulations (e.g. by removing constraints related to set-up of an
internal job market between the Executive Agencies). The REA should continue efforts
aimed at addressing HR challenges and further pursue its efforts to facilitate staff
engagement and enhance career development.
- Establish a structured dialogue between the Agency and the parent DGs to improve the
feedback of project-related information into policymaking in line with the Memorandum of
Understanding developed between REA and its parent DGs. Attention should be drawn to
analysing the specific information needs of the Commission for policymaking and
determining the mechanisms that could facilitate its provision.
- Continue close monitoring of the actual workload and the factors contributing to its level,
redistributing the administrative resources if needed. As the number of REA staff will
further increase in 2016-2020, request additional resources to be allocated according to
the results of the related analysis.
- Priority should be given to further consolidation and optimisation of the central support
services, including the newly delegated services to the Agency in Horizon 2020. In the
medium and long term, the costs and benefits of any new delegations of the services to
REA, particularly those not related to the Research family, should be carefully assessed.

What concrete measures and actions were taken to address these recommendations? Did REA
create an action plan to address the recommendations? Is it possible to get access to this action
plan?

What further actions, in your opinion, are required to improve the Agency’s work in the future?

Interview questionnaire: interviews with REA’s beneficiaries

General questionnaire:
 Please provide us with the following details: Type of your organisation,
country, year of application.
 How many REA-funded projects are you currently involved in? Could you
please list the names of the projects? Is/are the project(s) completed or still
ongoing?
 How are these projects funded? (e.g. grant, service contract, open call for
tender).
 How did you find out about the funding opportunity?
 Do you have any previous experience with EC funding through other
instruments? If yes, please provide details (name of project, dates, funding
levels).
 How many times have you applied for REA funding?
 Did you have any successful projects or unsuccessful applications during the
previous funding period? Please provide details. Did you receive feedback on
your application and if so, how helpful was this?
 If your application was not successful, did you nevertheless proceed with
your project and what effect, if any, did the rejection have on it?
Application procedure:
 How easy (or otherwise) did you find it to complete the application
procedure?
 Did REA provide you with all the information you needed and was it
responsive to any questions you may have had?
 How long did REA take to make a decision on your application? Do you regard
the time it took as reasonable?
 Did you receive feedback on your application? How useful was this feedback
in informing future applications?
Contract negotiation:
 If your application was successful, how satisfactory were the contract
negotiations? What could be done to improve this stage of the procedure?
Project implementation:
 How well in your view has REA managed the programme and the procedures
relating to your project (financial, monitoring and reporting procedures,
etc.)? What could be done to improve the way projects are administered?
Overall:
 If you had REA-funded projects during the previous programming period,
were there any notable differences in the application process and project
monitoring? Please provide details of any beneficial changes or any changes
which you believe had a particularly negative impact.
 Overall, how efficiently and effectively is REA performing from your
perspective?
 What could be done to improve the way the Agency operates?

Specific questions regarding your project:

Communication with REA:


 How regularly are you in contact with your desk officer at REA? In your
opinion, is this level of contact satisfactory?
Contract:

 How clear did you find the conditions of your contract and the expenditure
eligibility requirements? What, if anything, could be improved?
IT tools:
 How satisfied are you with the grant management tools (e.g. online portal,
submission of deliverables, feedback on deliverables, etc.)? What, if anything,
could be improved?
 Have you experienced any difficulties in using the online portal for electronic
submission? If so, were these dealt with in a timely and appropriate manner?
What could be improved?
Queries:
 Have you raised any queries regarding the day-to-day management of your
project? In your opinion, were these resolved in a timely and satisfactory
manner? If no, what could be improved? Please provide details.
Reporting requirements:
 How appropriate are the reporting requirements of the project in terms of
additional workload? If inappropriate, could anything be improved to reduce
the administrative burden in this area?
Deadlines/Guidance available:
 Have you experienced any difficulties in meeting deadlines for deliverables
and reporting (technical and financial)? If yes, please provide details of the
problem and how it was resolved. Is there anything that could be improved
with regard to project management and the guidance available?
Payments:
 Have all payments to date been received on time? How satisfied are you with
the timeliness and efficiency of payment processing procedures? Please
explain any extenuating circumstances.
Do you have any further comments regarding REA’s communication with you, and the
performance of REA staff?

Annex 4: Final Survey Questionnaires

The questionnaire of the survey of REA beneficiaries: Horizon 2020

Background information about the survey

Dear [contact(’first name’)] [contact(’last name’)],

The purpose of this survey, which is being conducted as part of the ongoing evaluation of the
Research Executive Agency, is to measure your satisfaction with the services provided by the
Agency, and to identify areas for improvement.

The questionnaire follows the project management life-cycle and contains the following
sections: application phase, evaluation phase, grant finalisation phase, REA validation services,
project implementation phase, REA’s communication and interaction with you and the general
section on the overall performance of the Agency.

The questionnaire consists of a series of questions related to your specific experiences in
contracts managed by the Agency. REA’s administrative data indicates that your organisation
has been awarded the following contract: [project name], contract number [contract number].
Where relevant, the survey questionnaire will indicate questions for which you need to reflect on
your experiences in the specific contract.

No personal information will be revealed publicly, in full compliance with our Specific Privacy
Statement on Personal Data Protection. Your data will be strictly anonymised, and only
anonymised data will be shared with REA and the European Commission.

Thank you in advance for your cooperation!

I agree to participate in this survey


Yes

II. Application phase

1. Overall, how did you find out about the funding opportunities managed by REA?
Multiple answers are possible.
Research Participant Portal
Annual Work Programmes
National Contact Point
Another national source (e.g. research ministry,
an EU liaison office at my university)
Recommendation by colleagues, superiors etc.
A European Commission website (e.g.
FP7/H2020 portal, REA website, CORDIS)
EC social media feed
Online advertising (e.g. ads in search engines or
social networking sites)
EU event or promotional material (e.g. an info
day, an EU info stand at a conference)
Research Enquiry Service
Media (e.g. a specialised magazine)
My work or invitation as an expert evaluator
Other (please specify)
Do not know/cannot answer

While submitting your proposal for [project name] you underwent a number of administrative
procedures. We would like to learn more about your experiences in this process.

2. Please assess the following aspects of the application process for [project name]:
Please select one option in each row.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
Information for applicants
(e.g. about the call objectives,
eligibility and selection
criteria, documentation
needed, etc.) was easy to
find
Information for applicants
(e.g. about the call objectives,
eligibility and selection
criteria, documentation
needed, etc.) was clear
I knew who to contact for any
question(s) I had or where to
get help when preparing my
application
I knew who to contact for any
question(s) I had or where to
get help when submitting
my application
The requirements for the
application process (e.g. the
volume of proposal,
requirements for supporting
documents, etc.) were
reasonable and
proportionate
The proposal templates were
well structured and easy to
follow
The electronic tool used for
submitting the application was
user-friendly
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

3. How would you assess the overall timeliness of the following processes of [project
name]?
Please select one option in each row.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
The time period from the call
deadline to the time the
outcome of the proposal was
announced to you (i.e. time-
to-inform) was appropriate

The time period from the
announcement of your
proposal’s outcome to the
time you signed the contract
(i.e. time-to-contract) was
appropriate
The overall time period from
submission of the proposal to
signature of the grant
agreement (i.e. time-to-
grant) was appropriate

4. Would you have any comments regarding the application phase of REA’s calls? In
particular, what simplification and service improvement measures would you
suggest?

III. Evaluation phase

We would like to learn more about your experiences during the evaluation phase in [project
name].

5. To what extent do you agree with the following statements referring to the expert
reviews and communication of the result of your proposal’s evaluation?
This question is about your specific experiences during the evaluation phase in [project name].
Please select one option in each row.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
The results of my application
via the Participant Portal
were easy to access
The individual reviews and
panel comments provided
were clear
The individual reviews and
panel comments provided
were useful in
understanding the
strengths and weaknesses
of my proposal
Overall, the evaluation
process was transparent
Other practicalities (please specify if relevant, otherwise select the ‘Do not know/cannot answer’ option)

If you answered “Disagree” or “Strongly disagree,” please specify:

6. Did you make use of any means of redress following the evaluation of your proposal
for [project name]?
Yes
No
Do not know/cannot answer/not relevant

6.1. Which means of redress did you make use of?
This question is about your specific experiences during the evaluation phase in [project name].
Please select one option.
This question will show up only for those who replied “Yes” in the question above.
Request for evaluation review by REA (the evaluation review procedure)
Request for legal review of the Agency decision by the Commission (‘Article 22 Request’ of the
Council Regulation)
Bringing action for annulment before the Court of Justice of the European Union
Do not know/cannot answer/not relevant

6.2. To what extent do you agree with the following statements concerning the means
of redress?
This question is about your specific experiences during the evaluation phase in [project name].
Please select one option in each row.
This question will show up only for those who replied “Yes” in question 6.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
Information for applicants on
the different means of redress
was easy to find
The redress procedure was
clearly explained
The redress was conducted
according to the procedure
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

7. Do you have any further comments regarding the evaluation process? In particular,
what simplifications or service improvements would you suggest?

III. Grant finalisation phase

We would like to learn more about your experiences during the grant finalisation phase in
[project name].

8. To what extent do you agree with the following statements about the granting
process of [project name]?
Please select one option in each row.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
The instructions provided by
REA at the beginning of the
granting process were clear
The REA staff assigned to my
project in the grant
finalisation and negotiation
phase were easily available
and responsive
Requests from REA (e.g. for
proposal modification or
providing missing
information) were clear
The electronic tools used in
the negotiation/contracting
process were user-friendly
Overall, the granting process
was transparent
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

9. Would you have any comments regarding the grant finalisation phase of REA
grants? In particular, what simplification and service improvement measures would
you suggest?

IV. REA Validation Services

We will now ask you some questions about REA Validation Services. This includes activities
related to the validation of participants, financial viability check assessments, as well as LEAR
validation.

10. Have you used REA Validation Services?
This question is about your specific experiences during the validation process in [project name].
Yes Go to 10.1.
No
Do not know/cannot answer

10.1. Have you used REA Validation Services for the following purposes:
This question is about your specific experiences during the validation process in [project name].
Please select one option in each row.
This question will show up only for those who replied “Yes” in the question above.
Yes | No | Do not know/cannot answer
The validation of a participant
The financial viability check
assessment
The Legal Entity Appointed
Representative’s (LEAR) extended
mandate validation

10.2. To what extent do you agree with the following statements about the validation
process in [project name]?
Please select one option in each row.
This question will show up only for those who replied “Yes” in question 10.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
The electronic tools used for
the validation and
assessment of
beneficiaries were user-
friendly
The process of validating the
beneficiaries was smooth
and required reasonable
effort

11. Did you consult the information available on the Participant Portal for the
validation of participants?
This question is about your specific experiences during the validation process in [project name].
Yes Go to 11.1.
No
Do not know/cannot answer

11.1. Did you find the information available on the Participant Portal regarding the
validation of participants:
Please select one option in each row.
This question will show up only for those who replied “Yes” in the question above.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer
Easy to access
Clear
Concise

V. Project implementation

We would like to learn more about your experiences during the project implementation phase in
[project name].

12.0. Have you already submitted any periodic reports in [project name] to REA?
This question will show up only for ONGOING PROJECTS.
Yes
No
Do not know/cannot answer

12. Please assess the following aspects of the periodic reporting for [project name]:
Please select one option in each row.
This question will show up for all closed projects and for those ongoing projects who replied
“Yes” in the question above.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer

The information from REA concerning administrative requirements was clear
The REA staff assigned to my project during the implementation were easily available and responsive
The feedback I received on the progress of the content of my project (e.g. in a mid-term review) was useful
Technical reporting requirements were clear
Financial reporting requirements were clear
Overall, the periodic reporting requirements were appropriate to the level of activities in my project
The electronic tools used for managing my grant were user-friendly
The process of monitoring my project by REA was smooth
The process of monitoring my project by REA was transparent
Other practicalities (please specify if relevant, otherwise select the ‘Do not know/cannot answer’ option)

13.0. Have you received any of the following payments in [project name] to date?
Please select one option in each row.
This question will show up only for ONGOING PROJECTS.
Yes No
Pre-payment Go to 13.
Interim payment(s) Go to 13.
Final payment Go to 13.

13. To what extent do you agree with the following statements?
This question is about your specific experiences during the project implementation phase in
[project name]. Please select one option in each row.
This question will show up for all closed projects. In addition, for ongoing projects certain
options of this question will show up only for those who replied “Yes” in the respective options
of the question above.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/cannot answer

For pre-payment: the time it took the Agency to make the payment was appropriate
For interim payments: the time it took the Agency to process payment requests and make payments was appropriate
For the final payment: the time it took the Agency to process the payment request and make payment was appropriate

14. Did you undergo any grant amendment procedures in [project name]?
Yes Go to 14.1.
No
Do not know/cannot answer/not relevant

14.1. To what extent do you agree with the following statement regarding the grant
amendment procedures in [project name]?
Please select one option in each row.
This question will show up only for those who replied “Yes” in the question above.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer

The REA staff assigned to my grant amendment procedure were easily available and responsive
The information and advice
provided by REA during the
amendment process was clear
The time it took REA to process
grant amendment requests was
appropriate

Overall, the grant amendment process was smooth

Other practicalities (please specify if relevant, otherwise select the ‘Do not know/cannot answer’ option)

15. Would you have any comments regarding the grant implementation phase of REA
grants? In particular, what simplification and service improvement measures would
you suggest?

VI. REA’s communication and interaction with you

We will now ask you some questions about REA’s communication and interaction with you.
These questions are about your overall experiences with the Agency, i.e. if you benefited from
more than one contract, please consider the overall performance of the Agency in this
area.

16. To what extent have the following communication channels used by REA provided
you with relevant, helpful information when you needed it?
Please select one option in each row.
To a large extent / To a moderate extent / To some extent / To a little extent / Not at all / Does not apply to me
Email contact
Telephone contact
Face-to-face
contacts
(meetings, events)
Recorded video
briefings
Live web briefings
(with a chat
function)
REA’s Facebook
profile
REA’s website
Other communication channels (please specify if relevant, otherwise select the ‘Do not know/cannot answer’ option)

17. When preparing your application for [project name] or thereafter, did you contact
the Research Enquiry Service or National Contact Points?

Please select one option in each row.
Yes, once / Yes, more than once / Never, even though I was aware of this service / Never, because I was not aware of this service / Other/do not know/cannot answer
Research Enquiry Service (if ‘Yes’, go to 17.1)
National Contact Points (if ‘Yes’, go to 17.1)

17.1. To what extent was the response from these services helpful?
Please select one option in each row.
Certain options of this question will show up only for those who replied “Yes, once” or “Yes,
more than once” in the respective options of the question above.
Very helpful / Helpful / Neither helpful nor unhelpful / Not very helpful / Not helpful at all / Do not know/cannot answer
Research Enquiry Service
National Contact Points

18. Would you agree or disagree with the following statements regarding your
awareness of REA-managed calls/grants?
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer

Overall, funding opportunities for REA-managed programmes are well advertised

When applying for this grant, I was aware that REA grants were funded from the EU budget

When applying for this grant, I was aware that REA was entrusted to manage its grants by the European Commission

19. Do you have any further comments regarding REA’s communication and
interaction with you? In particular, what simplifications or service improvement
would you suggest?

VII. Overall performance of the Agency

This section contains the final set of questions about REA’s performance. Please consider your
overall experience with the Agency when answering them.

20. Overall, how satisfied are you with REA’s services?


Very satisfied
Satisfied
Neither satisfied nor dissatisfied
Dissatisfied
Very dissatisfied
Do not know/cannot answer

21. Would you consider applying for REA calls again?


Yes, certainly
Yes, maybe
No, probably not
No, certainly not
Do not know/cannot answer

21.1. What are the main reasons for indicating that you are unlikely to apply for REA
calls in the future?
Multiple answers are possible.
This question will show up only for those who replied “No, probably not” or “No, certainly not” in
the question above.
The eligibility requirements for proposals are too high
The success rate for REA-managed applications is too low
The application procedure is too complex
The administrative requirements for managing REA grants are too heavy
Other reasons (please specify)
Do not know/cannot answer

22. Do you have any final comments and suggestions for improvements in the quality
of REA’s services which have not been addressed in this questionnaire?

PPMI might launch a short follow-up interview programme in November or December on certain
topics related to REA’s performance and services provided between 2015 and 2018 which were
not covered in this survey. The short interviews will ask for more details on your specific
experiences with REA.

23. FINAL QUESTION: Would you agree to be contacted via email for more details on
your specific experiences with REA?
Yes Go to 23.1.
No

23.1. If you would like to be contacted at an email address different from the one
through which you received this survey, please enter the address below.
This question will show up only for those who replied “Yes” in the question above.

Thank you for your answers!

The questionnaire of the survey of unsuccessful REA applicants

Background information about the survey

Dear [contact(’first name’)] [contact(’last name’)],

The purpose of this survey, which is being conducted as part of the ongoing evaluation of the
Research Executive Agency, is to measure your satisfaction with the services provided by the
Agency, and to identify areas for improvement.

The survey questionnaire is designed for those H2020 applicants who were unsuccessful in their
applications for one or more grants managed by REA during 2015-2018.

The questionnaire covers the following aspects of your experiences with REA: application phase,
evaluation phase, REA's communication and interaction with you, and a section on the overall
performance of the Agency.

REA’s administrative data indicates that your organisation has applied for
[contact('organisation')], and that the outcome of this application was negative. When
answering the questions, please consider, to the extent possible, your specific experiences
during the application and evaluation processes for [contact('organisation')].

No personal information will be revealed publicly, in full compliance with our Specific Privacy
Statement on Personal Data Protection. Your data will be strictly anonymised, and only
anonymised data will be shared with REA and the European Commission.

Thank you in advance for your cooperation!

I agree to participate in this survey


Yes

II. Application phase

1. Overall, how did you find out about the funding opportunities offered by REA?
Multiple answers are possible.
Research Participant Portal
Annual Work Programmes
National Contact Point
Another national source (e.g. research ministry,
an EU liaison office at my university)
Recommendation by colleagues, superiors etc.
A European Commission website (e.g.
FP7/H2020 portal, REA website, CORDIS)
EC social media feed
Online advertising (e.g. ads in search engines or
social networking sites)
EU event or promotional material (e.g. an info
day, an EU info stand at a conference)
Research Enquiry Service
Media (e.g. a specialised magazine)
My work or invitation as an expert evaluator
Other (please specify)
Do not know/cannot answer

While submitting your proposal for [project name] you underwent a number of administrative
procedures. We would like to learn more about your experiences in this process.

2. Please assess the following aspects of the application process for [project name]:
This question is about your specific experiences during the application phase in [project name].
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer
Information for applicants
(e.g. about the call objectives,
eligibility and selection
criteria, documentation
needed, etc.) was easy to
find
Information for applicants
(e.g. about the call objectives,
eligibility and selection
criteria, documentation
needed, etc.) was clear
I knew who to contact for any
question(s) I had or where to
get help when preparing my
application
I knew who to contact for any
question(s) I had or where to
get help when submitting
my application
The requirements for the
application process (e.g. the
volume of the proposal,
requirements for supporting
documents, etc.) were
reasonable and
proportionate
The proposal templates were
well structured and easy to
follow
The electronic tool used for
submitting the application was
user-friendly
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

3. How would you assess the overall timeliness of the following process in [project
name]?
This question is about your specific experiences in [project name].
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer
The time period from the call
deadline to the time the
outcome of the proposal was
announced to you (i.e. time-
to-inform) was appropriate

4. Would you have any comments regarding the application phase of REA’s calls? In
particular, what simplification and service improvement measures would you
suggest?

III. Evaluation phase

We would like to learn more about your experiences during the evaluation phase in [project
name].

5. To what extent do you agree with the following statements referring to the expert
reviews and communication of the result of your proposal’s evaluation?
This question is about your specific experiences during the evaluation phase in [project name].
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer
The results of my application
via the Participant Portal
were easy to access
The individual reviews and
panel comments provided
were clear
The individual reviews and
panel comments provided
were useful in
understanding the
strengths and weaknesses
of my proposal
Overall, the evaluation
process was transparent
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

If you answered “Rather disagree” or “Strongly disagree”, please specify:

6. Did you make use of any means of redress following the evaluation of your
proposal?
Yes Go to 6.1.
No

6.1. Which means of redress did you make use of?


This question will show up only for those who replied “Yes” in the question above.
Please select one option.
Request for evaluation review by REA (the evaluation review procedure)
Request for legal review of the Agency decision by the Commission (‘Article 22 Request’ of the Council Regulation)
Bringing action for annulment before the Court of Justice of the European Union
Do not know/cannot answer/not relevant

6.2. To what extent do you agree with the following statements concerning the means
of redress?
This question will show up only for those who replied “Yes” in question 6.
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer
Information for applicants on
the different means of redress
was easy to find
The redress procedure was
clearly explained
The redress was conducted
according to the procedure
Other practicalities (please
specify if relevant, otherwise
select the ‘Do not
know/cannot answer’ option)

7. Do you have any further comments regarding the evaluation process? In particular,
what simplifications or service improvements would you suggest?

IV. REA’s communication and interaction with you

We will now ask you some questions about REA’s communication and interaction with you.
These questions are about your overall experiences with the Agency, i.e. if you have applied for
more than one project, please consider the overall performance of the Agency in this area.

8. When preparing your application for [project name], did you contact the Research
Enquiry Service or National Contact Points?
Please select one option in each row.
Yes, once / Yes, more than once / Never, even though I was aware of this service / Never, because I was not aware of this service / Other/do not know/cannot answer
Research Enquiry Service
National Contact Points

8.1. To what extent was the response from these services helpful?
Please select one option in each row.
This question will show up only for those who replied “Yes, once” or “Yes, more than once” in
the question above.

Very helpful / Helpful / Neither helpful nor unhelpful / Not very helpful / Not helpful at all / Do not know/cannot answer
Research Enquiry Service
National Contact Points

9. Would you agree or disagree with the following statements regarding your
awareness of REA-managed calls/grants?
Please select one option in each row.

Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer
Overall, funding
opportunities for REA-
managed programmes
are well advertised
When applying for this
grant, I was aware that
REA grants were
funded from the EU
budget
When applying for this
grant, I was aware that
REA was entrusted to
manage its grants by
the European
Commission

10. Do you have any further comments regarding REA’s communication and
interaction with you? In particular, what simplifications or service improvement
would you suggest?

V. Overall performance of the Agency

This section contains the final set of questions about REA’s performance. Please consider your
overall experience with the Agency when answering them.

11. Overall, how satisfied are you with REA’s services?


Very satisfied
Satisfied
Neither satisfied nor dissatisfied
Dissatisfied
Very dissatisfied
Do not know/cannot answer

12. Would you consider applying for REA calls again?


Yes, certainly
Yes, maybe
No, probably not
No, certainly not
Do not know/cannot answer

12.1. What are the main reasons for indicating that you are unlikely to apply for REA
calls in the future?
Multiple answers are possible.
This question will show up only for those who replied “No, probably not” or “No, certainly not” in
the question above.
The eligibility requirements for proposals are too high
The success rate for REA-managed applications is too low
The application procedure is too complex
The evaluation procedure is not transparent
Other reasons (please specify)
Do not know/cannot answer

13. Do you have any final comments and suggestions for improvements in the quality
of REA’s services which have not been addressed in this questionnaire?

Thank you for your answers!

The questionnaire of the survey of REA independent experts: monitoring experts &
evaluators

Background information about the survey

Dear [contact(’first name’)] [contact(’last name’)],

The purpose of this survey, which is being conducted as part of the ongoing evaluation of the
Research Executive Agency, is to measure your satisfaction with the services provided by the
Agency, and to identify areas for improvement.

The questionnaire contains questions about your overall experience with REA. If you evaluated
more than one proposal/monitored more than one action during 2015-2018, please consider
your overall experience with the Agency or choose an evaluation/action which you think best
describes your experience with REA.

No personal information will be revealed publicly, in full compliance with our Specific Privacy
Statement on Personal Data Protection. Your data will be strictly anonymised, and only
anonymised data will be shared with REA and the European Commission.

Thank you in advance for your cooperation!

I agree to participate in this survey


Yes

II. Registration/Creation of expert profile

The first group of questions concerns the registration/creation of your expert profile on the
Participant Portal. Please answer these questions if you registered/created your profile during
2015-2018. If you cannot answer these questions, choose the ‘Do not know/cannot answer’ option.

1. Where did you first hear about the opportunity to become an independent expert of
the Agency/Commission?
Please select up to three options.
By participating in a research project supported by FP7/H2020
Relevant national source (e.g. research ministry, an EU liaison office at my university)
Recommendation by colleagues, superiors in my institution
European Commission website
REA website
European Commission/REA social media feed
Online advertising (e.g. ads in search engines or social networking sites)
EU event or promotional material (e.g. an info day, an EU info stand at a conference)
Media (e.g. a video showcasing REA’s evaluation facilities, a specialised magazine)
Other (please specify):
Do not know/cannot answer

2. Did you consult/use any of the online information which is available on the
Participant Portal before starting the registration process?
Yes Go to 2.1.
No
Do not know/cannot answer

2.1. Please assess the usefulness of the information provided online:


Please select one option in each row.
This question will show up only for those who replied ‘Yes’ in the question above.
Very useful / Fairly useful / Neither / Not very useful / Not at all useful / Do not know/cannot answer/not relevant
H2020 online manual
FAQ for experts
Reference documents
H2020 Helpdesk – the Research
Enquiry Service (RES)
IT Helpdesk
Other sources of information
(please specify if relevant,
otherwise select the ‘Do not
know/cannot answer/not
relevant’ option)

3. To what extent do you agree or disagree with the following statements concerning
your registration process:
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
Information on how to become an
independent expert was easy to
find
Eligibility requirements to become
an expert were balanced
The selection procedures were
transparent
Overall, the registration process
was smooth
Other practicalities (please specify
if relevant, otherwise select the ‘Do
not know/cannot answer/not
relevant’ option)

4. Please provide your comments related to the registration process (if any):

III. Legal Entity/Bank Account Validation

Before you could be contracted as an expert, your identity and bank account details had to be
verified ("Legal Entity / Bank Account Validation"). This procedure is carried out separately from
the registration process.

5. Please assess the LE/BA validation procedure:


Please select one option in each row.

Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
It was clear to me why my
identity details and bank
account information needed
to be checked
Email notification that I
received asking me to
provide my identity and
bank account details was
clear
The IT tool in the Participant
Portal related to LE/BA
validation was easy to use

I had no difficulties in
providing the requested
supporting documentation
in relation to LE/BA
validation

I am confident that the
information that I provided
will be protected from
unauthorised access and use
I am satisfied with the
support provided by the
people who dealt with my
file
Overall, the LE/BA
validation process was
smooth

6. Please provide your comments related to the Legal Entity/Bank Account Validation
process (if any):
Please insert your comments in the box below.

IV. Selection and contracting

We will now ask you a series of questions on the selection and contracting procedures you
underwent during 2015-2018. If you evaluated more than one proposal/monitored more than
one action during 2015-2018, please consider your overall experience with REA or choose an
evaluation/action which you think best describes your experience with the Agency when
answering the following questions.

7. Were you contacted with regard to your availability to work as an independent expert
for a particular evaluation or project review before you received an expert contract to
sign?
Yes
No
Do not know/cannot answer/not relevant

8. Please give your assessment of the selection and contracting process:


Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
The contract was issued
in sufficient time for me
to organise my
schedule/work
Email notification
informing me that a
contract has been sent to
me by the Commission
was clear
The contract sent to me
by the Commission was
easy to access and sign
Overall, the selection and
contracting process was
smooth

Experts are required to declare conflicts of interest (CoI), either at the time of signature of the
contract or as soon as a CoI is identified.

9. Please assess the process in relation to CoI:


Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
It is clear to me under
what circumstances I
should declare a CoI
The consequences of a
CoI are clear
I understand when and
how to declare a CoI

A PowerPoint presentation on the application of the confidentiality rules and the processing of
personal data is provided to independent experts. The emphasis is on the issues of personal
data usage, protection and respect of confidentiality rules.

10. Please assess the process in relation to the processing of personal data and the
application of confidentiality rules:
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
It is clear to me what the
cases of possible or
actual misuse of personal
data are
I understand how breaches of confidentiality occur

The consequences of
misuse of personal data
and breach of
confidentiality are clear

V. Performance of the assignment/task

Now we will ask a few questions about your work before and during the evaluation/monitoring
process.

11. Which of the following functions did you have to carry out when contracted by
REA during 2015-2018?
Multiple answers are possible.
Evaluation of proposal(s)
Monitoring of project activities
Other (please specify)

11.1. To what extent do you agree or disagree with the following statements
concerning your work as an expert during the evaluation/monitoring process:
If you evaluated more than one proposal/monitored more than one action during 2015-2018,
please consider your overall experience with REA or choose an evaluation/action which you
think best describes your experience with the Agency when answering the following questions.
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
Information provided by REA was clear
and sufficient (e.g. guides for evaluators,
guidance on how to use the electronic
evaluation system, briefing documents)
I was appropriately briefed on the
requirements for my work (esp. through
the telephone conference briefings)
It was clear to me how to evaluate and
rate proposals/monitor project
activities

Tasks I had to carry out were clearly stated in the contract

11.2. To what extent do you agree or disagree with the following statements
concerning your work as an expert during the evaluation/monitoring process:
If you evaluated more than one proposal/monitored more than one action during 2015-2018,
please consider your overall experience with REA or choose an evaluation/action which you
think best describes your experience with the Agency when answering the following questions.
Please select one option in each row.
Some options of this question will only show up for those who selected option 1 in question 11
(evaluators).
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
The electronic evaluation system was easy to access
*This option will only show up for evaluators

The electronic evaluation system was easy to use
*This option will only show up for evaluators
The templates for the reports I had to
complete were fit for purpose
Time available for drafting the reports
was appropriate
I knew either who to contact or where
to get help regarding any questions I
had when working on my tasks
The REA staff with whom I worked were responsive (e.g. by email or
phone)
The REA staff with whom I worked provided useful answers to my
questions
Overall, the evaluation/monitoring
process was smooth

12. Please provide your comments related to your work as an expert (if any):
Please insert your comments in the box below.

13. Did you carry out tasks at the Commission/Agency premises in Brussels during
2015-2018?
Please select one option.
Yes, I carried out my tasks at the Commission/Agency premises in Brussels (centrally) Go to 13.1.
No, I carried out my tasks at my own premises only (remotely) Go to 16.
I carried out my tasks at the Commission/Agency premises and at my own premises (centrally and remotely) Go to 13.1.
Do not know/cannot answer

13.1. Please assess the extent to which you are satisfied with the following aspects of
REA evaluation/monitoring facilities in Brussels:
If you participated in the process of evaluation/monitoring for more than one project during
2015-2018, please consider your overall experience with REA or choose an evaluation/action
which you think best describes your experience with the Agency when answering the following
questions. Please select one option in each row.
This question will show up only for those who selected option 1 or 3 in the question above.
Some options of this question will only show up for those who selected option 1 in question 11
(evaluators).
Very satisfied / Fairly satisfied / Neither satisfied nor dissatisfied / Not very satisfied / Not at all satisfied / Do not know/cannot answer/not relevant
Clarity of the
information provided on
how to
evaluate/interpret
evaluation criteria

Scoring system for proposals
*This option will show up only to evaluators
Overall service provided
at the reception desk
(i.e. access to
registration desk, badge,
indication of meeting
room)
Overall logistics and
infrastructure (i.e.
meeting rooms, working
spaces, refreshments
during meetings)
IT tools, internet
availability and
assistance provided by
IT staff
Practical information for
hotels and transport in
Brussels
Hotel
list/accommodation
availability
Catering Facilities
(cafeteria)
Overall level of
assistance and
responsiveness of the
REA staff during your
visit to Brussels

14. To what extent do you agree with the following statements regarding the
reimbursements of costs for your visit(s) to Brussels?
Please select one option in each row.
This question will show up only for those who selected option 1 or 3 in question 13.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
Reimbursement forms
were provided on time
Reimbursement forms
were easy to complete
It was clear which travel-
related expenses were
not eligible for
reimbursement
The time it took REA to
process my
reimbursement forms
and process payments
was satisfactory
Overall, the
reimbursement of the
costs was smooth

14.2. Did you visit the Reimbursement Help Desk during your evaluation in Brussels?
This question will show up only for those who selected option 1 or 3 in question 13.
Yes Go to 14.3.
No
Do not know/cannot answer/not relevant

14.3. Please assess the quality of information/assistance provided by REA through the
Reimbursement Help Desk:
Please select one option in each row.
This question will show up only for those who replied ‘Yes’ in the question above.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
I received the
information/answer that I
needed
I am satisfied with the follow-up
of my query
The person who I spoke to was
responsive
The person who I spoke to was
qualified to assist me

When an expert needs to travel for an assignment, travel costs are reimbursed to and from the
expert's address as indicated in the contract (the "Point of Departure" – POD). An expert who
wishes to travel to and/or from a different address must request prior approval ("request for
change of point of departure").

15. Please assess the process in relation to requests to change the point of departure
for assignments involving travel:
Please select one option in each row.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
I understand when and
how to request a change
of POD
Decisions on a change of
POD are made quickly

VI. Payment

The following questions are about the payments you received for your services as an
independent expert. If you evaluated more than one proposal/monitored more than one action
during 2015-2018, please consider your overall experience with REA or choose an
evaluation/action which you think best describes your experience with the Agency when
answering the following questions.

16. To what extent do you agree with the following statements concerning payments
for your services?
Please select one option in each row.

Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
Email notification informing
me that I could prepare and
submit my payment request
was clear
Email notification requesting
me to provide further
information/documentation
supporting my payment
request was clear
Email notification informing
me that I had been paid was
clear
The time it took REA to
process my payment requests
and make payments was
satisfactory
The payment I received
matched the effort I had
spent on my tasks
Overall, I am satisfied with
the way that my payment
was handled

17. Please provide your comments related to the payment process (if any):
Please insert your comments in the box below.

18. Did you use any of the following communication channels to obtain support from
REA?
Please select one option in each row.
Yes / No / Do not know/cannot answer/not relevant

REA Evaluation functional mailbox (if ‘Yes’, go to 18.1)
Briefings
Chat in COMPASS (if relevant,
otherwise choose the ‘Do not
know/cannot answer/not
relevant’ option)
Other (please specify if relevant, otherwise choose the ‘Do not know/cannot answer/not relevant’ option)

18.1. Please assess the quality of information/assistance provided by REA through the
REA Evaluation functional mailbox:
Please select one option in each row.
This question will show up only for those who replied ‘Yes’ for option 1 in question 18.
Strongly agree / Rather agree / Neither agree nor disagree / Rather disagree / Strongly disagree / Do not know/cannot answer/not relevant
The response time was
appropriate
The response I received
provided me with the
information I needed
Overall, I am satisfied
with the service I
received via this mailbox

VII. Final questions

This section contains the final set of questions about REA’s performance. Please consider your
overall experience with the Agency when answering the following questions.

19. Would you like to work with REA as an independent expert again in the future?
Please select one option.
Yes, certainly
Yes, maybe
No, probably not Go to 19.1.
No, certainly not Go to 19.1.
Do not know / cannot answer

19.1. What are the main reasons for indicating that you are unlikely to work with REA
as an independent expert again in the future?
This question will show up only for those who replied ‘No, probably not' or 'No, certainly not' in
the question above.
The professional fee rate (currently set at EUR 450) is too low
The fee that I received does not reflect the amount of time I actually spent on the tasks
Registration procedures to become an expert are too complex
Administrative procedures related to working as an expert are too complex
The amount of the daily and/or accommodation allowance is too low
The reimbursement of my travel expenses was unreasonably low
Personal/work reasons (e.g. changed area of work, excessive workload)
Other reasons (please specify)
Do not know / cannot answer

20. Is there anything else you would like to add regarding your experience when
contracted by REA to work as an independent expert?
Please provide your comments in the box below.

PPMI might launch a short follow-up interview programme in November or December on certain
topics related to REA’s performance between 2015 and 2018 which were not covered in this
survey. The short interviews will ask for more details on your specific experiences with REA.

21. FINAL QUESTION Would you agree to be contacted via email for more details on
your specific experiences with REA?
Yes Go to 21.1.
No

21.1. If you would like to be contacted at an email address different from the one
through which you received this survey, please enter the address below.
This question will show up only for those who replied ‘Yes’ in the question above.

Thank you for your answers!

The questionnaire of the survey of EC officials

Background information about the survey

Dear [contact(’first name’)] [contact(’last name’)],

The purpose of this survey, which is being conducted as part of the ongoing evaluation of the
Research Executive Agency, is to measure your satisfaction with the services provided by the
Agency, and to identify areas for improvement.

The questionnaire consists of a series of questions about your overall experience with REA in
relation to the preparation of work programmes, evaluation of proposals under calls, policy
feedback and the overall performance of the Agency. The questionnaire asks questions about
your experiences with the Agency from mid-2015 to mid-2018. If you worked with the Agency
any time during this 3-year period, please answer the questions presented.

If you worked on multiple programmes implemented by the REA and/or liaised with various REA
officials during the evaluation period, please consider the experiences which you think best
summarise your perceptions of the Agency while answering the questions.

No personal information will be revealed publicly, in full compliance with our Specific Privacy
Statement on Personal Data Protection. Your data will be strictly anonymised, and only
anonymised data will be shared with DG RTD.

Thank you in advance for your cooperation!

I agree to participate in this survey


Yes

I. Background

1. What was your position in the Commission during 2015-2018 when you worked with
REA?
If you held more than one position in the Commission during 2015-2018, please select all the
options that apply.
Project officer or policy officer
Deputy Head of Unit or Head of Unit
Other (please specify)

1.1. Please specify your position in the Commission during 2015-2018 when you
worked with REA:
Please insert your comments in the box below.

2. Which REA units did you work with during 2015-2018?


Multiple answers are possible.
Unit A1 - Marie Skłodowska-Curie, Innovative
Training Networks
Unit A2 - Marie Skłodowska-Curie, Individual
Fellowships: European
Unit A3 - Marie Skłodowska-Curie, Research and
Innovation Staff Exchange
Unit A4 - Marie Skłodowska-Curie COFUND,
Researchers’ Night and Individual Fellowships:
Global
Unit A5 - Fostering Novel Ideas: FET-Open

Unit B1 – Space Research
Unit B2 – Sustainable Resources for Food
Security and Growth
Unit B3 – Inclusive, Innovative and reflective
Societies
Unit B4 – Safeguarding Secure Society
Unit B5 – Spreading Excellence, Widening
Participation, Science with and for Society
Other (please specify)
Do not know/Cannot answer/Not relevant

2.1. Please specify any other REA units you worked with during 2015-2018:
Please insert your comments in the box below.

II. Preparation of work programmes and research topics

We will now ask you a series of questions on your collaboration with REA during the preparation
of the work programmes and/or research topics in 2015-2018.

3. Did you liaise with REA officers during the preparation of the work programmes
and/or research topics in your area of responsibility during 2015-2018?
Please select one option.
Yes Go to 3.1, 3.2, 3.3
No

3.1. Which forms of communication did you use to liaise with REA officers during the
preparation of the work programmes and/or research topics?
This question will show up only for those who replied “Yes” in question 3.
Multiple answers are possible.
Email contact
Telephone contact/video calls
Face-to-face meetings
Other communication channels (please specify if
relevant)

3.2. Please specify any other forms of communication you used to liaise with REA
officers during the preparation of the work programmes and/or research topics:
Please insert your comments in the box below.

3.3. How frequently did you liaise with REA officers in this process?
Please select one option.
This question will show up only for those who replied “Yes” in question 3.
Only the options selected in question 3.1. will show up in this question.
Daily/almost daily | Weekly | Every two weeks | Monthly | Quarterly | Other frequency (please specify if relevant)

Email contact
Telephone
contact/video calls
Face-to-face
meetings
Other communication
channels

3.4. Please specify the frequency with which you liaised with REA officers during the
preparation of the work programmes and/or research topics
Please insert your comments in the box below.

4. To what extent were REA’s inputs needed for the implementation of the work programmes or research topics that you/your unit oversaw during 2015-2018?
Please select one option.
To a large extent Go to 4.1
To a moderate extent Go to 4.1
To some extent Go to 4.1
To a little extent
Not at all
Do not know/ Cannot answer/ Not applicable

4.1. To what extent do you agree with the following statements regarding the inputs
provided to you or your unit by REA?
This question will show up only for those who replied, “To a large extent,” “To a moderate
extent” or “To some extent” in question 4.
Please select one option in each row.
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know/ Cannot answer/ Not applicable
REA provided me/my unit
with relevant inputs for
the programme
priorities/research topics
under my/my unit’s
responsibility
The inputs provided by REA
were timely
The inputs provided by REA
were of high quality
The inputs provided by REA were directly used in the preparation and implementation of the work programme/research topics
Other practicalities (please specify if relevant, otherwise select the ‘Do not know/ Cannot answer/ Not applicable’ option)

4.2. Please specify any other practicalities regarding the inputs provided to you or
your unit at the Commission by REA:
Please insert your comments in the box below.

4.3. Could you provide any concrete examples of how REA's work has directly benefited the preparation and implementation of the work programmes/research topics?
Please insert your comments in the box below.

III. Evaluation of proposals under calls and/or research topics

We would like to learn more about your experiences with REA during the evaluation of proposals
under calls and/or research topics in 2015-2018.

5. Did your DG/you provide briefings to REA staff before the start of evaluation
activities for your calls in 2015-2018?
Please select one option.
Yes Go to 5.1
No
Do not know/ Cannot answer/ Not applicable

5.1. To what extent do you agree with the following statements regarding REA staff?
This question will show up only for those who replied “Yes” in question 5.
Please select one option in each row.
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know/ Cannot answer/ Not applicable
REA staff were attentive to your DG’s/your briefings
REA staff actively participated in your DG’s/your briefing sessions
REA staff took your DG’s/your briefings into account while organising evaluations

6. In your opinion, were you sufficiently informed by REA about the progress of the
evaluations under your calls or research topics?
This question concerns the overall evaluation process implemented by REA, and the extent to
which the information/updates you received was/were sufficient. By ‘sufficient’ information
about the progress of the evaluations we mean that the communication by your REA
counterpart(s) was a) proactive; b) regular; c) informative.
Please select one option.
Yes, to a large extent
Yes, to a moderate extent
Yes, to some extent
Yes, to a little extent
No, I was not sufficiently informed by REA about the
progress of evaluations in my programme and/or
research topics
Do not know/ Cannot answer/ Not applicable

6.1. Please specify why you think REA did not sufficiently inform you about the
progress of the evaluation process (e.g. in terms of proactiveness, regularity or
informativeness of its communication):
Please insert your comments in the box below.
This question will show up only for those who replied “Yes, to a little extent” or “No, I was not
sufficiently informed by REA” in question 6.

7. Were you invited to the REA briefings for experts under your calls or research
topics before the start of evaluation activities in 2015-2018?
Please select one option.
Yes, always
Yes, most of the time
Yes, sometimes
No, generally I was not invited to the briefings
Do not know/ Cannot answer/ Not applicable

8. Did you attend the REA briefings for experts under your calls or research topics
before the start of evaluation activities in 2015-2018?
Please select one option.
Yes, always
Yes, most of the time
Yes, sometimes
No, generally I did not attend the briefings
Do not know/ Cannot answer/ Not applicable

9. Did you take part in any consensus meetings organised by REA for the calls or
research topics under your responsibility during 2015-2018?
Please select one option.
Yes Go to 9.1.
No
Do not know/ Cannot answer/ Not applicable

9.1. If applicable, to what extent do you agree with the following statements
regarding the consensus meetings in which you participated?
Please select one option in each row.
This question will show up only for those who replied “Yes” in question 9.
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know / Cannot answer
Meetings were well
organised
Meetings were effective in
achieving a consensus
Other (please specify)

9.2. Please specify any other practicalities regarding the consensus meetings in which you
participated:
Please insert your comments in the box below.

10. In your opinion, did REA implement a process which ensures that the proposals
best addressing the specific research topics under your area of responsibility were
selected for funding, as defined in the H2020 Rules for Participation and further detailed in
the Vademecum?
Please select one option.
Yes, always
Yes, most of the time
Yes, sometimes
No, REA overall did not implement a process which
ensures that the proposals best addressing the
specific research topics under your area of
responsibility were selected for funding, as defined
in the H2020 Rules for Participation and further
detailed in the Vademecum (please specify further)
Do not know/ Cannot answer/ Not applicable

10.1. Please specify why you think REA did not implement a process which ensures
that the proposals best addressing the specific research topics were selected for
funding:
Please insert your comments in the box below.

11. Overall, to what extent are you satisfied with the following aspects of the
evaluation process organised by REA?
This question concerns your overall satisfaction with the evaluation procedures implemented by
REA in your calls or research topics. Please select one option in each row.
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know/ Cannot answer/ Not applicable
Quality of evaluations was
appropriate
Timeliness of evaluations
was appropriate
The evaluation process was
transparent
Other practicalities (please
specify if relevant,
otherwise select the ‘Do
not know/cannot answer’
option)

11.1. Please specify any other practicalities regarding the evaluation process
organised by REA:
Please insert your comments in the box below.

IV. Policy feedback and knowledge management at REA

We will now ask you some questions about the policy feedback provided to you by REA since
2015.

12. Beyond regular meetings, which types of policy feedback outputs did REA provide
to you or your unit at the Commission during 2015-2018?
Please select one option in each row.
Yes, REA provided such policy feedback outputs to me or my unit at the Commission during 2015-2018 | No, REA provided no such policy feedback outputs to me or my unit at the Commission during 2015-2018 | Do not know/ Cannot answer/ Not applicable
Cluster meetings with
Parent DGs and other
stakeholders (NCP,
etc.)
Coordination meeting
with Project Officer /
Policy Officer
Contributions from
REA to the preparation
of policy reports
Participation of REA in
thematic events
(specific policy
related) organised by
the EC
Written feedback from
REA to Parent DGs on
the WP
implementation
Reporting to
Programme
Committee and/or
Advisory Groups
Project kick-off/review
meeting with Policy
Officers attending
Innovation Radar
Input on REA projects
for EC communication
and dissemination
Collecting and giving
feedback about
researchers' needs
and satisfaction
Follow-up on success
stories
Other types of policy
feedback outputs
(please specify)

12.1. Please specify other types of key policy feedback outputs REA provided to you or
your unit at the Commission during 2015-2018:
Please insert your comments in the box below.

12.1 Were these policy feedback outputs produced on a regular or ad hoc basis?
This question will show up only for those who replied ‘Yes’ to any of the options in question 12.
Only the options selected in question 12 will show up in this question.
Regular policy feedback output (i.e. mostly regularly occurring and/or scheduled output) | Ad hoc policy feedback output (i.e. mostly irregularly occurring and/or not scheduled output) | Do not know/ Cannot answer/ Not applicable
Cluster meetings with Parent DGs and other
stakeholders (NCP, etc.)
Coordination meeting with Project Officer / Policy
Officer
Contributions from REA to the preparation of
policy reports
Participation of REA in thematic events (specific
policy related) organised by the EC
Written feedback from REA to Parent DGs on the
WP implementation
Reporting to Programme Committee and/or
Advisory Groups
Project's kick-off/Review meeting with Policy
Officers attending
Innovation Radar
Input on REA projects for EC communication
and dissemination
Collecting and giving feedback about
researchers' needs and satisfaction
Follow-up on success stories
Other types of policy feedback outputs

12.2 How satisfied were you with the frequency of REA’s policy feedback outputs
produced during 2015-2018?
We are interested to learn whether the frequency of REA’s outputs was satisfactory during
2015-2018, or whether any of the outputs could be produced more (or less) frequently.
This question will show up only for those who replied ‘Yes’ to any of the options in question 12.
Only the options selected in question 12 will show up in this question.
Very satisfied | Rather satisfied | Neither satisfied nor dissatisfied | Rather dissatisfied | Very dissatisfied | Do not know/ Cannot answer/ Not applicable
Cluster meetings
with Parent DGs and
other stakeholders
(NCP, etc.)
Coordination
meeting with Project
Officer / Policy
Officer
Contributions from
REA to the
preparation of policy
reports
Participation of REA
in thematic events
(specific policy
related) organised
by the EC
Written feedback
from REA to Parent
DGs on the WP
implementation
Reporting to
Programme
Committee and/or
Advisory Groups
Project kick-
off/review meeting
with Policy Officers
attending
Innovation Radar
Input on REA
projects for EC
communication and
dissemination
Collecting and giving
feedback about
researchers' needs
and satisfaction
Follow-up on success
stories
Other types of policy
feedback outputs

12.3. Please specify the reasons for your dissatisfaction with the frequency of REA’s policy
feedback outputs produced during 2015-2018:
Please insert your comments in the box below.

12.4. How satisfied were you with the quality of the policy feedback outputs produced
by REA?
This question will show up only for those who replied ‘Yes’ to any of the options in question 12.
Only the options selected in question 12 will show up in this question.
Very satisfied | Rather satisfied | Neither satisfied nor dissatisfied | Rather dissatisfied | Very dissatisfied | Do not know/ Cannot answer/ Not applicable
Cluster meetings
with Parent DGs and
other stakeholders
(NCP, etc.)
Coordination
meeting with Project
Officer / Policy
Officer
Contributions from REA to the preparation of policy reports
Participation of REA
in thematic events
(specific policy
related) organised
by the EC
Written feedback
from REA to Parent
DGs on the WP
implementation
Reporting to
Programme
Committee and/or
Advisory Groups
Project kick-
off/review meeting
with Policy Officers
attending
Innovation Radar
Input on REA
projects for EC
communication and
dissemination
Collecting and giving
feedback about
researchers' needs
and satisfaction
Follow-up on success
stories
Other types of policy
feedback outputs

12.5. Please specify the reasons for your dissatisfaction with the quality of REA’s policy
feedback outputs produced during 2015-2018:
Please insert your comments in the box below.

12.6. To what extent did you use these REA outputs to inform your policymaking
tasks?
Please select one option in each row.
This question will show up only for those who replied ‘Yes’ to any of the options in question 12.
Only the options selected in question 12 will show up in this question.
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know/ Cannot answer/ Not relevant
Cluster meetings with Parent DGs
and other stakeholders (NCP, etc.)
Coordination meeting with Project
Officer / Policy Officer
Contributions from REA to the preparation of policy reports
Participation of REA in thematic
events (specific policy related)
organised by the EC
Written feedback from REA to Parent
DGs on the WP implementation
Reporting to Programme Committee
and/or Advisory Groups
Project's kick-off/review meeting
with Policy Officers attending
Innovation Radar
Input on REA projects for EC
communication and dissemination
Collecting and giving feedback about
researchers' needs and satisfaction
Follow-up on success stories
Other types of policy feedback
outputs

13. Could you provide any concrete (good) examples of how REA’s inputs fed into
your policymaking tasks?
Please insert your comments in the box below.

13.1 Could you name any policy feedback output(s) which are currently missing, i.e. which
REA does not provide to you?
Please insert your comments in the box below.

14. Were you/your DG interested in attending any project review meetings for
projects funded under your programme or research topics during 2015-2018?
Please select one option.
Yes, all or almost all of the relevant project review meetings (i.e. for projects
funded under my/my DG’s programme or research topics)
Yes, most of the relevant project review meetings
Yes, some of the relevant project meetings
No, generally I/my DG were not interested in attending any relevant project
review meetings organised by REA
Do not know/ Cannot answer/Not relevant

15. Did REA share information with you/your DG about the scheduled project review
meetings in your programme or research topics?
Please select one option.
Yes, always (i.e. for all/virtually all projects funded under my programme or research topics) Go to 15.1
Yes, most of the time Go to 15.1
Yes, sometimes Go to 15.1
No, REA generally did not share information about scheduled project review meetings in my programme or research topics Go to 15.1
Do not know/ Cannot answer/Not relevant Go to 15.1

15.1 Were your DG/you invited by REA to attend the project review meetings for
projects funded under your programme or research topics?
Typically, such invitations would take place when an EC official signals his/her willingness to
attend the review meetings; it is also possible that REA invites EC officials to the meetings even
if they do not explicitly signal their interest in a particular project review meeting. Please select
one option.
Yes, always Go to 15.2, 15.3, 15.4
Yes, most of the time Go to 15.2, 15.3, 15.4
Yes, sometimes Go to 15.2, 15.3, 15.4
No, REA generally did not invite me to attend
any project review meetings that I was
interested in
Do not know/ Cannot answer/Not relevant

15.2. Which types of project review meetings were you invited to attend?
Multiple answers are possible.
This question will show up only for those who replied ‘Yes, always,’ ‘Yes, most of the time’ or
‘Yes, sometimes’ in question 15.1
Review meetings for single projects
Clustered project meetings
Other (please specify)
Do not know/ Cannot answer/Not relevant

15.3. Please specify any other types of project review meetings you were invited to
attend:
Please insert your comments in the box below.

15.4. To what extent do you agree with the following statements regarding the
project review meetings you attended?
Please select one option in each row.
This question will show up only for those who replied ‘Review meetings for single projects,’
‘Clustered project meetings’ or ‘Other (please specify)’ in question 15.2
To a large extent | To a moderate extent | To some extent | To a little extent | Not at all | Do not know/ Cannot answer/ Not relevant
Project review meetings
were well organised
The meetings maintained a
good balance between
administrative and
content-based issues
Project review meetings
provided me with useful
inputs into my
policymaking tasks
Other practicalities (please
specify if relevant,
otherwise select the ‘Do not
know/cannot answer/not
relevant’ option)

15.5. Please specify any other practicalities regarding the project review meetings
you attended:
Please insert your comments in the box below.

V. General questions

This section contains the final set of questions about REA’s performance. Please consider your
overall experience with the Agency when answering them.

16. Overall, how satisfied are you with REA’s work?


Please select one option.
Very satisfied
Satisfied
Neither satisfied nor dissatisfied
Dissatisfied
Very dissatisfied
Do not know/ Cannot answer/Not relevant

17. To what extent do you agree or disagree with the following statements?
Please select one option in each row.
Strongly agree | Rather agree | Neither agree nor disagree | Rather disagree | Strongly disagree | Do not know/ Cannot answer/ Not relevant
REA effectively and efficiently
implements the programmes
delegated to it under my
portfolio of activities

REA enables me to focus entirely on my policymaking tasks
I am not involved in any
activities delegated to REA
which could be regarded as
programme implementation
tasks
REA is proactive in its daily
communication with me
REA provides me with
sufficient policy feedback to
inform my policymaking tasks
I have a good working
relationship with my REA
counterpart(s) at
interpersonal level
I actively use the inputs
provided by REA in my
policymaking tasks

18. Do you have any final comments and suggestions for improvements in the quality
of REA’s performance which have not been addressed in this questionnaire?

Thank you for your answers!

Annex 5: In-depth Study Areas

Assessment of REA’s coherence, separation of tasks/roles between the Commission and the Agency, as well as maintenance of the know-how within the Commission

This study area provides an in-depth analysis of REA’s knowledge management (KM) and
Policy Feedback Function (PFF). In particular, it considers the context in which KM
activities have taken place, reviews the specific initiatives and measures that have been
adopted by REA to strengthen policy feedback, analyses key achievements and the
extent of impacts to date, and identifies the remaining challenges. Progress made in the
2015-2018 period of operations is also compared with the situation in the previous 2012-
2015 period of operations.

The study was undertaken through a combination of desk research and interviews with
REA management and officials from the parent DGs. In addition, the results from Survey
C with Commission Officials were taken into account, as they are a valuable source of
information about the initial effects of the new Policy Feedback reporting options that
started to be implemented during the evaluation period.

Conceptual issues and different types of policy feedback

Conceptually, a distinction can be made between REA’s direct and indirect contributions to informing policymaking, both drawing on the knowledge generated by REA during the implementation of its delegated programmatic responsibilities. Direct
support to policymaking includes specific measures and initiatives such as the
development of a Policy Options Catalogue and supporting guidance, and the setting up
of a rapid reaction network to respond to requests for support from policymaking DGs.
However, REA has also contributed to policymaking indirectly. For instance, REA provides
data and information about research results from H2020 funded projects, which feed into
different monitoring and reporting tools, and external communication tools, such as the
Innovation Radar.

It is also important to note that policy feedback can take different forms. One form is the
extraction of policy-relevant data and information about projects, including factual
information as well as monitoring and reporting information on research results and
achievements, drawn from either individual projects or groups of projects. Policy feedback
can also take more structured, analytical forms, such as policy reports, briefing
documents and the policy factsheets that can be produced through the AGILE-Rapid
Reaction Network (described later in this study area). The different initiatives and tools
described below also reflect the fact that different types of measures may be required,
depending on the DG, the policy area and the urgency with which policy input is needed.
To take a simple example, there is a major difference between a policy factsheet
requested at short notice and a detailed policy report that draws on research results
across many FP projects as its evidence base.

Differences in the types of policy feedback required, and the format in which these are
generated therefore need to be considered when assessing the specific initiatives and
tools put in place by REA, since these have different resourcing and capacity implications
for REA to address the different parent DG’s needs satisfactorily.

Baseline situation at the beginning of the 2015-2018 period of operations and current challenges

A number of shortcomings relating to the effectiveness of REA’s support to policymaking
were identified in the previous evaluation of the Agency’s operations in the 2012-2015
period. While REA was found to have provided substantial amounts of data and
information to the parent DGs of the Commission through different reporting
mechanisms, according to some interviewees the Commission still lacked information on
the content of implemented projects.126

The evaluation report also recognised the complexity for REA of providing policy-relevant
feedback to all parent DGs, given that the number of parent DGs increased from four to
six between the 2007-2013 and 2014-2020 programming periods. It was noted that due
to the variety of programmes being administered by REA (including both top-down and
bottom-up research programmes), it would not be effective to implement a ‘one size fits
all’ approach to the provision of policy feedback and knowledge.

Nevertheless, the need for a common formal methodology was stressed, supported by
common practices and more systematic reporting, in order to ensure that knowledge
acquired by REA is translated into the provision of effective policy feedback. The lack of a
systematic approach was identified as one of the main weaknesses in respect of REA’s
PFF in 2012-2015, requiring further efforts to respond to the expectations of REA’s
different parent DGs. One of the central recommendations of the previous REA evaluation
was to consolidate structured dialogue between the Agency and parent DGs to improve
the feedback of project-related information into policymaking.

Some of the findings from the earlier evaluations need to be set in context. In REA’s first
mandate, the provision of policy feedback was not considered to be a significant issue,
since REA was mainly responsible for managing bottom-up research programmes open to
all areas of science, such as the multi-disciplinary MSCA fellowship grants for researchers’
mobility. Implementation required managing high-volume grants, and the DGs concerned
were mainly interested in REA fulfilling its project implementation tasks effectively, as
well as in statistics on mobility.

However, during the course of the current period, with more top-down programmes being
delegated to REA, expectations have increased that REA would play a stronger role in
assisting the EC in extracting relevant policy information. This has created an expectation
gap, since in planning for its second programming mandate in 2014-2020, REA noted
that the 2013 CBA made no specific financial provision for policy feedback.

A further problem has been that different DGs have had differing expectations as to
REA’s role in analysing thematically linked groups of projects. Both REA and the parent
DGs recognised in interviews that there were differences in expectations regarding the
extent and nature of REA’s role in the extraction of policy-relevant information from
projects supported and in particular, knowledge transfer from research results.

There were also challenges in managing expectations as to how far EC officials, as
opposed to REA, were responsible for extracting thematic information about projects
made accessible by REA in database form. In REA’s first mandate (2009-2013), the
intention was that REA would store all relevant information electronically in database
form, but the EC was expected to use their own tools and data mining approach to
extract relevant policy information. This has created uncertainty during its second
mandate in the current 2014-2020 period as to whether EC officials should be actively
provided with data analysed thematically, or whether this should be in response to
specific data queries sent on request, or whether the policy officials should access this
information themselves, since the information was designed by REA to be readily
accessible to the EC.

One barrier identified to EC policy officers’ use of these database tools, however, was that
the parent DGs were not, in REA’s view, investing enough time in developing the skills
needed to access the management information about projects that is available across
REA’s portfolio.
According to REA staff, Excel queries could be generated by policy officials; however,

126
PPMI (2016). Evaluation of the Operation of REA (2012-2015). Final Report.
197
they often asked the Agency to generate queries on their behalf. This view was shared by
some EC officials.

Context - Knowledge management and policy feedback

An assessment was carried out of the extent to which the Agency’s regulatory framework
provides sufficient clarity on REA’s policy feedback responsibilities. In common with
other EAs, the respective tasks and responsibilities of REA and of the parent DGs in the
Commission services were clearly described in the Delegated Act and the Memorandum of
Understanding. However, there was some ambiguity in the delineation of tasks between the
Agency and the parent DGs regarding REA’s role in informing policymaking. While it was
explicitly clear that the parent DGs in the Commission services are solely responsible
for policymaking, a question remained as to how far REA should focus on its PFF, as
opposed to fulfilling its core delegated programme implementation and its administrative
and logistical support responsibilities.

Recital 3 of REA’s Establishment Act for the 2014-2020 period (EC Decision
2013/778/EU) stated that the delegation of programme implementation tasks to an
Executive Agency requires a clear separation between programming, which involves a large
measure of discretion in making choices driven by policy considerations and is carried
out by the Commission, and programme implementation, which should be entrusted to the
Executive Agency. The absence of an explicit task in the regulatory framework to provide
policy-relevant information to parent DGs meant that there was a lack of clarity as to
how far REA was responsible for providing such information to its parent DGs in the
2014-2020 programming period, and how far EC officials ought to be responsible for
extracting it.

Despite the absence of an explicit remit in its regulatory framework, given REA’s close
proximity to H2020 beneficiaries and the significant number of sub-programmes the Agency
was responsible for implementing, it was commonly understood that the parent DGs might
reasonably expect REA to assist them in extracting policy-relevant information from the
implementation of projects, including informing policymaking based on research results.
Accordingly, REA already provided support to EU policymaking across its parent DGs, and
a set of reporting and dissemination mechanisms was already in place, partly to address
some of the weaknesses identified in the previous 2012-2015 evaluation. It was
acknowledged, however, that REA’s role in informing policy through the extraction of
research results and policy feedback was not yet supported effectively by established
business processes.

Measures undertaken at a strategic and regulatory level

Since the publication in 2016 of the previous evaluation findings covering the 2012-2015
period of operations, the Agency has undertaken several specific initiatives and
measures during the 2015-2018 period to strengthen REA’s Policy Feedback Function,
improve the effectiveness of mechanisms to capture knowledge, and provide timely and
more customised policy feedback to parent DGs. Actions were taken to improve structured
dialogue and to develop mechanisms ensuring that further and more relevant policy
feedback is provided to parent DGs.

In 2016, for instance, the MoU was updated, establishing a set of procedures for
ensuring more structured dialogue. Policy feedback was also introduced as a central
topic for the quarterly coordination meetings between REA and the relevant Commission
policy services. REA has also undertaken internal reflections to strengthen policy
feedback through the development of a document on Strengthening Policy Feedback in REA:
Mapping Insights and Recommendations. This initiative was developed by the Project
Monitoring and Policy Feedback Task Force (PFTF), which operated from 2016 to 2018. The
PFTF was operational during most of the evaluation period and was one of the Agency’s
most important policy feedback initiatives.
This demonstrated that REA is committed organisationally to improving its capacity to
provide coherent and effective policy feedback through improved knowledge
management, and to documenting good practices in the definition and implementation of
policy options.

Various challenges were identified by the PFTF in designing the subsequent guidance
document on policy feedback developed by the Agency, including the following:

• Responding to ad hoc requests from policymakers was not considered to be viable
(this has subsequently been addressed, however, through the setting up of the
AGILE-Rapid Reaction Network, through which policy factsheets can be requested).

• There is no proper stocktaking via policy feedback reviews.

• Policy officers are not always trained on how to utilise the policy feedback provided.

• Practices vary as to whether REA invites policy officers from the Commission to attend
relevant project (review) meetings. Not all policy officers are invited, even though
there is strong demand to attend such meetings, according to the results of Survey C.
The participation rate of those invited is consequently low, which is a missed
opportunity to strengthen policy feedback.

• It is not currently possible to flag, within the IT system for monitoring ongoing
research projects, those projects considered especially relevant to policymaking.

However, some of these challenges have also already been partially addressed through
the design of new initiatives and measures to strengthen policy feedback.

The guidance on Policy Feedback (PF) explains that PF is a “twofold process that
implies contributions from the operational level in support of: i) policy implementation
and ii) policy development (including new policies or revision of current policies). This
proactive process is based on a trust-based relationship with parent DGs that ensures a
clear understanding of the relevant policy framework and political priorities. The
ultimate goal is to support the EC to ensure the positive impact of knowledge on
policies.” To support the implementation of this approach, REA has created a Policy
Feedback Structured Mechanism based on three pillars. Through the first pillar, REA
provides support to beneficiaries and derives policy messages from interactions with
beneficiaries through project monitoring. Through the second pillar, REA fulfils its
core policy feedback function, taking into account the categorisation of policy measures
included in the PF guide and its supporting list of 60 measures. The third pillar is the
AGILE-Rapid Reaction Network, a mechanism providing rapid responses to pressing policy
needs of parent DGs through the development of policy factsheets on request (see the
study area on knowledge management). The diagram below shows the three pillars of the
mechanism.

Figure 38. Structured Mechanisms for Policy Feedback.
Source: compiled by CSES based on REA’s Policy Feedback Factsheet.

The structured policy feedback mechanism described above is based on the following
guiding principles: 1) creating value; 2) maintaining flexibility in managing the
relationships between REA and its parent DGs; and 3) becoming agile.

Under each pillar, mechanisms have been put in place for REA to capitalise on the
knowledge created. In order to support FP beneficiaries in fully exploiting the research
results from their projects and to contribute better to policymaking, REA encourages
projects to have an interface with policy DGs and to apply to initiatives such as the
Common Dissemination Booster, the Common Exploitation Booster and the European
Investment Project Portal (EIPP). It also:

• encourages Open Science and the long-term sustainability of data;

• encourages networking between projects (communication platform) and the production of
joint recommendations; and

• requires minimum mandatory policy-relevant information from projects, such as
“Project-to-Policy Round Tables” and “Policy Papers”.

In 2017, to address weaknesses identified in the 2012-2015 evaluation, a more
structured approach was put in place to foster such dialogue. REA developed an internal
strategy through the Project Monitoring and Policy Feedback Task Force, which was
established by the REA Network of Project Officers (NPO). Its aim was to map, analyse
and develop recommendations to strengthen the effectiveness and smooth operation of
REA’s Policy Feedback Function. The PFTF concluded that while REA’s Policy Feedback
Function has been proactive in complying with the tasks required of it, REA lacks a
common formal methodology and well-established common practices in the provision of such
feedback. The main recommendations encompassed all stages of the policy feedback
reporting cycle: the need for a better definition of “policy feedback”; the
identification of relevant stakeholders so as to better customise policy feedback to the
stakeholders concerned; better structuring of REA’s PFF; better communications; and
improvements to better capitalise on data and information (e.g. about projects being
implemented, and the exploitation and dissemination of research results).

The overarching nature of the internal strategy was translated into a detailed document,
the Catalogue of Policy Options, which demonstrates REA’s commitment to further
operationalise the PFF. The catalogue categorises different alternative options into
policy pull and push measures. A distinction was made between:

• policy support measures, such as encouraging Open Science and the long-term
sustainability of data, networking between projects through the creation of a
communication platform, and setting up the AGILE-Rapid Reaction Network;

• organisational measures, such as shared folders with common IT platforms containing
project-related information;

• communications-related measures, such as the development of a newsletter to be shared
with the parent DGs and the creation of a communications plan for the PF concerned;

• IT-related measures, such as tools to facilitate IT analysis for programmes and the
training of POs;

• policy-development measures, such as REA’s involvement in the preparation of policy
reports; and

• proactive management tools, such as organising thematic cluster events linked to
policymaking.

The various initiatives mentioned above demonstrated REA’s organisational commitment
to improving its capacity to provide coherent and effective policy feedback by
strengthening business processes relating to knowledge management and through the
development of specific tools and instruments.

Implementation measures undertaken to strengthen policy feedback

A further positive development was that implementation measures to strengthen policy
feedback were also undertaken during the evaluation period. A multi-faceted cooperation
and policy feedback mechanism was developed at the level of each delegated specific
programme within Horizon 2020. Commission policy officials were also given the
opportunity to directly retrieve policy messages, since they were invited to attend
kick-off and review meetings. Policy feedback actions relevant to each specific
programme were also presented by REA at the bi-annual coordination meetings with its
parent DGs.

The preparation of a detailed list of options for policy feedback exchange and the
additional implementation measures described above are both steps in the right
direction. Nevertheless, in order to establish a set of operational working methods for
working together with parent DGs, it is important that the policy reporting options should
be tested over a longer period of time. At this stage, it is still relatively early in the
implementation of the new Policy Options framework to evaluate key achievements, since
insufficient time has passed to evaluate some of the new tools.

It was suggested by some interviewees that it would be useful to limit the overall
number of suggested policy feedback options to a shorter list of effective operational
practices, customised to reflect the different specificities of the policymaking needs
of the six parent DGs. There were concerns that, otherwise, the mapping of a long list
of options was too generic an approach to be effective. However, REA pointed out that in
order to address the different policymaking requirements of the parent DGs, and to
customise the options to identified policymaking needs, the Commission services need to
strengthen their involvement in the process initiated by the Agency and to better
formulate their policy feedback needs. Because these needs are often insufficiently
established, and therefore unclear, an expectations gap arises between what different
DGs expected in terms of policy feedback and the type and format of knowledge outputs
feeding into policymaking that they actually received.

There are evidently also limits to how far some of the policy feedback options developed
by REA can be evaluated: while many different options are identified in the Catalogue of
Policy Options, only some were implemented in the 2015-2018 period of operations, and
some activities were implemented only towards the end of the evaluation time-scope.

REA has also implemented specific measures to strengthen the timeliness of inputs to
support policymaking, such as the AGILE-Rapid Reaction Network, which is designed to
enhance flexibility by providing a single knowledge management and information channel
through which units within parent DGs can request prompt input. Further information
about this Network is provided in the mini case study below:

A case study on the AGILE-Rapid Reaction Network

Mechanism to facilitate knowledge-sharing and policy feedback: the AGILE-Rapid
Reaction Network.

Aims: The Network is designed to provide a single channel for knowledge-sharing based
on the Policy Feedback approach. A further aim was for the Network to provide data on
cross-cutting research activities and on new and emerging topics of EU importance that
are not yet well developed.

Description: The Network is a relatively new, demand-driven initiative requested by the
Commission services in 2017. Through a single channel for knowledge-sharing, the Network
allows the full REA portfolio to be screened to identify relevant projects being
supported within the thematic portfolio relevant to the requesting policy unit. Feedback
is then provided in the form of a factsheet within 2-3 days. The tool has been
administered by REA Networks, which are responsible for implementing an action plan that
includes information tools to be developed to strengthen REA’s inputs to policymaking.

Part of the rationale was that such a tool could address urgent requests for policy
feedback in new and emerging areas of scientific research, as well as on research
topics being supported of a cross-cutting nature, where the contribution of the FPs,
when spread across many projects across diverse programmes, may previously have
been difficult to identify.

There are examples of research into topics which fall outside the scope of a specific
programme being managed by REA, but where the Agency can add value by checking across
its programme portfolio under which sub-programmes projects may have been funded that
could have contributed to research topics cutting across different parts of the
programme. Examples provided were research into oceans, nanomaterials, cancer and
batteries.

Coherence and effectiveness of the tool: The Network was viewed by some interviewees
as a useful tool to facilitate knowledge-sharing. It is much more than an internal
information and knowledge exchange network, since it sometimes feeds not only into
information-sharing through networking by REA with its parent DGs, but also into the
wider Research family. The initiative was regarded as a good-practice example of close
collaboration, through the involvement of mirror units, between units implementing part
of a programme within REA and parent DGs.

The Network was viewed by REA management and some officials from parent DGs as having
the potential to help REA fulfil its expanded role and responsibilities under the Policy
Feedback Function. The tool should help to maximise the alignment of scientific research
results with policymaking needs by creating new opportunities for interaction between
those undertaking research, REA (which implements the sub-programmes and monitors the
achievement of project results) and policymakers.

Lessons learned: According to interviewees from REA, the implementation of this
Network has to date proved both an organisational and a practical challenge. REA needs
to ensure that it addresses the human resource and technical capacity challenges
necessary to operate this relatively new initiative effectively. This suggests a need
for capacity-building support to ensure that the Network is able to respond to requests
sufficiently rapidly.
Source: compiled by CSES.

Taken overall, the strategic and more operational initiatives put in place appear to
have been well designed to tackle shortcomings identified in the earlier evaluation.
However, it is too early to provide a full assessment of some initiatives launched by
REA to strengthen policy feedback, since these were implemented only relatively
recently. Nevertheless, it is worth stressing the feedback from EC officials through
interviews and Survey C: while strong progress is being made, further customisation of
policy outputs to meet the differing needs of policymakers is needed.

Achievements and impacts to date and outstanding challenges

Although most policy options have not been tested through at least one full programme
implementation cycle, the interview programme and Survey C with Commission officials
provided important indications about the impact of the measures implemented to date. For
example, despite the significant potential of REA’s PFF, EC officials shared some
concerns (interviews, Survey C) regarding the effectiveness of the practical
implementation of some measures designed to strengthen policymaking, in particular
regarding the manner in which relevant information is shaped by REA officials. This
relates to the lack of specialisation of human resources, and in particular insufficient
training for some REA staff in how to contribute to enhancing policymaking.

According to some interviewees from parent DGs, there was a perceived lack of know-how
among some Agency staff as to how to select the most relevant data and information about
projects implemented through H2020 programmes under REA management, and how to extract
the most policy-relevant research results. Some interviewees also perceived this as
negatively influencing the manner in which the staff concerned develop the preliminary
synthesis analysis of research results and identify policy-relevant feedback. Commission
officials from parent DGs suggested that this challenge could be overcome by further
training.

Among the suggestions made for future measures that would be especially pertinent from
a parent DG perspective were: improving parent DGs’ access to the projects managed by
REA; encouraging projects to have an interface with policymaking DGs; and establishing
shared folders/a common IT platform with project-related information. Some parent DGs
expressed the view that more direct contact with beneficiaries would improve their
knowledge about programme implementation and the results being achieved through
project-level research activities. Suggestions were also made regarding the volume of
reports produced by the Agency. Some Commission officials interviewed (confirmed by the
findings from Survey C) believed that outputs to inform policymaking could be
strengthened by increasing the level of detail and through further formulation of the
policy-relevant conclusions extracted.

However, the feedback received was contradictory. Whereas some Commission officials
would like to receive more information, others commented that due to time and workload
pressures, they were unable to make sufficient use of the policy-relevant information
already provided. The interviews identified a general increase in the Commission’s
workload as a problem that has limited the take-up of some REA knowledge outputs meant
to contribute to strengthening policymaking. This limited take-up was therefore
attributed to a lack of human resource capacity to process the policy-relevant
information produced by the Agency. This demonstrated that additional human resource and
technical capacity is needed not only within the Agency, but also across the
Commission’s parent DGs, to achieve optimal provision of policy-relevant information and
knowledge exchange between the Agency and its parent DGs.

While some progress has been made in improving the effectiveness of the PFF, the
implementation of activities supported within this function appears to be still very
much under development. Survey C with Commission officials also highlighted that during
the initial few years of implementation of the different policy reporting mechanisms
listed in the Catalogue of Options, REA made use of the policy feedback options to
differing extents. Whereas some options were seen as having been utilised extensively,
others have not been used at all. Some interviewees in parent DGs saw this as reflecting
the absence of a sufficiently systematic approach to the Agency’s implementation of the
Policy Feedback Function. However, REA pointed out that the Catalogue of Options
provides a typology that was intended to be used by EC officials as a suite of options.
In addition, the level of satisfaction with the different policy feedback options and
reporting tools used to disseminate policy-relevant information varied significantly.
Moreover, the findings suggest that some policy reporting options were more efficient
than others, and potentially more appropriate for promoting improvements in structured
dialogue between the Agency and the Commission in the future.

The survey results also highlighted that while some new working methods and practices
have been developed that could be beneficial in strengthening the effectiveness of
policy feedback, they are not yet effective because they are not being implemented
systematically. The opportunity for Commission officials to directly retrieve
policy-relevant messages through direct invitations to REA’s meetings is one such
example127. In total, 50 % of Commission officials reported that they did not take part
in the consensus meetings organised by REA for the calls or research topics under their
responsibility, 20 % took part in these meetings and 30 % stated “don’t know”. This
indicates that some Commission officials are not engaging sufficiently to derive
policy-relevant information through a more proactive approach (rather than simply
relying on REA to provide them with information which they do not always use).

Conclusions

Overall, significant progress has been made by REA in strengthening policy feedback in
2015-2018 compared with the 2012-2015 period. Several steps were taken during the
evaluation period towards well-established, multi-dimensional and continuous cooperation
in policy feedback and knowledge exchange between REA and parent DGs, notably through
the creation of the PFTF. Among the developments that helped to strengthen the provision
of feedback to policymakers about policy-relevant aspects of programme implementation
were the adoption of a Catalogue of Policy Options, an amendment to the MoU establishing
a set of procedures for ensuring more structured dialogue, and the introduction of
policy feedback as a key topic in the quarterly coordination meetings between REA and
the relevant Commission counterparts. In addition, REA undertook other concrete
initiatives to strengthen policy feedback, such as the organisation of thematic cluster
meetings to derive policy-relevant results and the creation of the AGILE-Rapid Reaction
Network.

Despite the positive progress made overall, there was less positive feedback from some
Commission officials. Some DGs perceived that greater attention was being given to
informing policy development in some policy areas than in others. Some officials also
highlighted that not all policy options identified in the Policy Options Catalogue have
been equally effective or applied with similar frequency to date. Some challenges
identified in the previous evaluation have been difficult to address. REA was not
originally provided with financial resources or staffed to perform policy feedback
activities. However, an expectation has emerged that REA will increasingly be required
to extract relevant policy information from research results in both the current and
future programming periods. While the next CBA will need to factor this in, it remains
unclear how much resource will be made available for policy feedback. A concern of REA
is that it may need to provide much more extensive policy feedback for the same
resources by making significant internal efficiency gains, and it is not yet known at
this stage in the programme planning cycle whether such gains would be realistic. A
further challenge is the ongoing mismatch between the Commission’s expectations on
policy feedback and the availability of information about policy-relevant research
results. This should be a problem that can be addressed in future, since REA only
started working on this area in the 2014-2020 period. Looking ahead, however, the
differing needs and expectations of each DG should be established by the relevant DGs,
with the involvement of REA.

127 However, it should be noted that it has recently been clarified in discussions with
DG RTD and DG AGRI that the participation of policy officers in consensus discussions
will only be expected when they concern actions with policy interest.

Looking to the remainder of H2020 and beyond to a possible successor programme, it could
be helpful both for REA and its parent DGs for the MoU to define more clearly REA’s role
and tasks relating to informing policy development through the extraction of
policy-relevant information and research results. In addition, REA has signalled its
intention to include, for each programme, a written agreement on what form policy
feedback should take (using the Policy Options Catalogue as the starting point for the
agreed basis on which such feedback would be provided) and on which delivery channels
should be used for disseminating such information. This could be attached to REA’s
annual management plan and is expected to evolve over time. Manual updates of any
changes to the types of policy outputs required by DGs could be made by REA as and when
these are agreed between REA and its parent DGs on a bilateral basis.

Assessment of management and provision of central support services

This in-depth study area analyses REA’s management and provision of central support
services. In particular, it examines the extent to which the Agency was effective and
efficient in providing central support services, and how satisfied the Agency’s key
stakeholders were. Below we present our key findings on the assessment of the
centralised FP7 and Horizon 2020 services that REA was in charge of during the
evaluation period. To analyse the Agency’s practices, we relied on desk research as well
as feedback gathered through the interview and survey programmes. The findings are
structured according to the key centralised support services provided by REA.

Registration process

According to the survey of REA’s independent experts, almost 88 % of respondents
strongly or rather agreed that the registration process was smooth, up from 83 % in
2014. The eligibility requirements and the ease of finding information on how to become
an independent expert were also assessed positively. On the other hand, the findings
were less positive on the transparency of the selection procedures: nearly 57 % assessed
this aspect of the registration process positively, while over 20 % of respondents
strongly or rather disagreed with the related statement. Further analysis of the survey
results revealed that recommendations by colleagues or superiors in one’s institution
(37 %), the EC website (27 %), participation in a research project supported by
FP7/H2020 (26 %) and relevant national sources (21 %) were the key sources of
information about the opportunity to become an independent expert for REA.

[Figure 39 shows the share of respondents choosing each answer (strongly agree / rather
agree / neither agree nor disagree / rather disagree / strongly disagree) for four
statements: “Overall, the registration process was smooth” (N=2,339); “Eligibility
requirements to become an expert were balanced” (N=2,221); “Information on how to become
an independent expert was easy to find…”; and “These selection procedures were
transparent” (N=2,152; 28 % strongly agreed, 29 % rather agreed, 22 % neither agreed nor
disagreed, 14 % rather disagreed and 7 % strongly disagreed).]

Figure 39. The extent to which expert evaluators agreed with the related statements on the registration process.
Source: survey of REA’s external experts.

Before experts could be contracted, their identities and bank account details had to be
verified via REA’s Legal Entity/Bank Account Validation service. This procedure is
carried out separately from the registration process. Survey data show that the majority
of experts (typically between 86 % and 95 %, depending on the specific question asked)
had a positive experience and were satisfied with the support provided and the quality
of the services received, up from around 80-85 % in 2014. A total of 71 % of experts
also strongly or rather agreed that the email notification asking them to provide their
identity and bank account details was clear.

Selection & contracting of experts

Regarding the selection and contracting of experts, almost 94 % of survey respondents
were contacted about their availability to work as an independent expert for a
particular evaluation or project before they received an expert contract to sign.
Overall, the selection and contracting process was assessed very positively, as was the
timeliness of contract issuance (about 90 % of survey respondents strongly or rather
agreed with these statements, compared with 85 % in 2014).

[Figure 40 shows the share of respondents choosing each answer (strongly agree / rather
agree / neither agree nor disagree / rather disagree / strongly disagree) for four
statements: “The email notification informing me that a contract has been sent to me by
the Commission was clear” (N=2,373; 76 % strongly agreed, 21 % rather agreed); “The
contract sent to me by the Commission was easy to access and sign” (N=2,380; 76 %
strongly agreed, 20 % rather agreed); “Overall, the selection and contracting process
was smooth” (N=2,372; 66 % strongly agreed, 27 % rather agreed); and “The contract was
issued in sufficient time for me to organise my schedule/work” (N=2,381; 54 % strongly
agreed, 33 % rather agreed).]

Figure 40. The extent to which expert evaluators agreed with the related statements on the selection and contracting process.
Source: survey of REA’s external experts.

REA’s assistance to the experts during performance of the assignments and tasks

Figure 41 shows a detailed breakdown of the experts' feedback on the different aspects
of their work. Although the feedback was largely positive, the experts' responses
showed lower satisfaction with the online evaluation system used, the quality of the
templates used for the reports, and the time available for drafting the reports128. It
should be noted that these issues are not fully under the control of REA, as it relies on
corporate tools and templates provided by DIGIT/CSC and on timing that results from the
TTG targets.

128
It should be noted that some of these aspects (the online evaluation system and the quality of the templates
used for the reports) are agreed on centrally and are not under the control of REA. Hence, they should not be
linked to REA's performance.
- The REA staff with whom I worked were responsive (e.g. by email or phone) (N=2249): 96 %
- The REA staff with whom I worked provided useful answers to my questions (N=2237): 95 %
- Information provided by REA was clear and sufficient (N=2364): 94 %
- Tasks I had to carry out were clearly stated in the contract (N=2358): 93 %
- It was clear to me how to evaluate and rate proposals/monitor project activities (N=2362): 92 %
- Overall, the evaluation/monitoring process was smooth (N=2354): 91 %
- I knew either who to contact or where to get help regarding any questions I had when working on my tasks: 91 %
- The electronic evaluation system was easy to access (N=2317): 91 %
- I was appropriately briefed on the requirements for my work (N=2274): 89 %
- The electronic evaluation system was easy to use (N=2316): 83 %
- The templates for the reports I had to complete were fit for purpose (N=2321): 81 %
- Time available for drafting the reports was appropriate (N=2354): 60 %

Figure 41. Share of expert evaluators who strongly or rather agreed with the related statements on the performance of their
assignments and tasks.
Source: survey of REA’s external experts.

Payments to experts

In 2014, for example, only 58 % of the payments were made within 30 days. In order to
quickly address the situation and avoid additional delays, staff from other units in REA
and interim staff strengthened the team already in place. In the last quarter of 2014,
the backlog was cleared and TTP performance returned to its usual high standards. In
2015, 18 800 payment files were validated with an average TTP of 15 days, and around
94 % of the payments were made within the 30-day contractual limit. The number of
payments in 2017 increased to 26 041. The average TTP in 2016-2017 decreased further
to around 11 days, and approximately 99 % of payments were made within the
30-day limit.

Table 28. Execution of payments to experts during 2014 – mid-2018.

Expenditure type | Number of payments made | Average time-to-pay (net) | Average time-to-pay (gross) | Share of expert payments made on time (target = 30 days)
Experts – 2014 | 10 585 | 29.95 | 30.67 | 58 %
Experts – 2015 | 18 800 | 15.0 | 15.7 | 96 %
Experts – 2016 | 25 777 | 11.0 | 11.6 | 99 %
Experts – 2017 | 26 041 | 10.5 | 11.0 | 99 %
Experts – 1st semester 2018 | 2 172 | 11.0 | N/A | 98 %

Source: REA’s Annual Activity Reports 2014-2017.

Despite the delays experienced in 2014, the time it took REA to process the payment
requests and make payments was assessed positively by a majority of the surveyed
independent experts, while approximately 90 % of them were, overall, satisfied with the
way that their payments were handled. The 2018 survey revealed that the share of
independent experts who strongly or rather agreed with this statement increased to
92 %.

In 2014, however, nearly half of the surveyed experts (44 %) disagreed that the
number of days they were paid for their remote work matched the effort they actually
spent on their tasks. In the 2018 survey, this share decreased to 23 %. Additional
feedback to an open-ended survey question revealed that a significant share of the
experts felt they had spent more time on their tasks than they were compensated for,
and that their daily allowance was too low. This feedback may relate to experiences of
working as an expert before REA revised the task allocation matrix to allocate more time
to individual tasks. More detailed analysis of the survey results, however, did not
produce evidence to suggest that this affected the experts' overall satisfaction with the
Agency's services or their willingness to work with REA in the future (nearly 91 % of the
experts would certainly, and another 8 % possibly, want to work with the Agency in the
future).

- It was clear which travel-related expenses were not eligible for reimbursement (N=1690): 71 % strongly agree, 23 % rather agree
- The time it took REA to process my payment requests and make payments was satisfactory (N=2323): 69 % strongly agree, 25 % rather agree
- Overall, I am satisfied with the way that my payment was handled (N=2331): 61 % strongly agree, 31 % rather agree
- The payment I received matched the effort I had spent on my tasks (N=2330): 35 % strongly agree, 29 % rather agree, 13 % neither agree nor disagree, 16 % rather disagree, 7 % strongly disagree

Figure 42. The extent to which expert evaluators agreed with the related statements on the payments made by REA.
Source: survey of REA’s external experts.

Research Enquiry Service

A review of the 2014-2017 AARs suggests that REA's Research Enquiry Service (RES)
handled queries effectively and answered the majority of questions within the set
timeframe. In 2014, the Research Enquiry Service received and responded to 13 000
new enquiries, of which 94 % were answered within 15 working days. Under the previous
framework programme, the average number of enquiries was about 5 000-6 000 per year.
In 2015, the number of requests decreased to nearly 10 700 and the share of enquiries
handled within 15 working days increased to 96 %. The number of enquiries for RES in
2016 was similar to 2015; however, the share of questions answered within 15 working
days decreased to 89 %. In 2017, RES responded to 7 900 requests for information, and
in January–June 2018 the number of questions for RES reached 4 100.

Table 29. Handling of requests to the enquiry service

Target | 2014 results | 2015 results | 2016 results | 2017 results | Mid-2018 results
Enquiries received | 13 000 | 10 700 | 11 000 | 7 900 | 4 100
Direct questions through the enquiry service answered or forwarded within 8 days, with an average of 3 working days | 100 % within 8 working days (99 % within 3 working days) | 100 % within 8 working days (100 % within 3 working days) | 96 % within 3 working days | 99.4 % | 99.8 %
Percentage of questions answered within 15 working days | 94 % | 96 % | 89 % | 94 % | 92.5 %
Note: with the new contract the way of calculating the SLA was modified affecting statistical data as of 1/2/2016
Source: compiled by PPMI based on REA’s AARs 2014-2017.

According to the survey of REA's beneficiaries, the Research Enquiry Service provided
high-quality services: around 93 % of the beneficiaries who used the service found the
responses very helpful or helpful. On the other hand, the survey responses imply that a
significant share of REA's beneficiaries (51 %) were unaware of the possibility to use the
service. This share has increased since 2014, when 44 % of REA's beneficiaries were
unaware of the service.

When preparing your application or thereafter, did you contact the Research Enquiry Service? Yes, once: 13 %; Yes, more than once: 8 %; Never, even though I was aware of this service: 29 %; Never, because I was not aware of this service: 51 %.
To what extent was the response from these services helpful? Very helpful: 54 %; Helpful: 39 %; Neither helpful nor unhelpful: 3 %; Not very helpful: 2 %; Not helpful at all: 1 %.

Figure 43. Beneficiaries‘ satisfaction with the services of REA’s Research Enquiry Service.
Source: survey of REA’s beneficiaries (the graph on the left is based on 477 valid responses and the graph on the right is based
on 94 valid responses).

REA’s Validation Services (including the implementation of SEDIA)

The overall performance of the Agency was highly favourable and target deadlines for
validations were fully respected according to the AARs. Moreover, the degree of client
satisfaction was positively assessed throughout 2015-2018 despite the challenges
associated with the significant extension of the Agency’s mandate in the area of
validation services. Since REA was delegated a major role in the development and
roll-out of the SEDIA project as of 1 January 2018, the Agency handles validations not
only for Horizon 2020 and several other programmes but also for all grant and

procurement activities under direct management and first level of indirect
management129.

As mentioned in the main report, REA’s mandate was extended as of 1 January 2018 to
the participant validation beyond H2020, e.g. by including operations of the Commission
services and other EU bodies (grants and procurement). This one-stop shop for legal and
financial data provides an important simplification for clients and participants as REA
centralises all of the following services:

 validation of legal information relating to the participants in all grant and procurement
procedures implemented by Commission services, Executive Agencies and Joint
Undertakings under direct management and first level of indirect management;

 validation of the LEAR appointed by the participant on the basis of relevant supporting
documents;

 sample-based verification of participants to confirm their SME status (since 2015) and
validation, if requested, of participants in the SME Instrument;

 preparation of the financial capacity assessment of participants in operations managed
by SEDIA clients, in line with the legal provisions and the specific requirements of their
programmes.

As a result, a series of new clients operating grant programmes and procurement actions
under direct management have been gradually transferred to REA throughout 2018.
Although the quality of services provided through the SEDIA participant validation service
cannot yet be completely assessed in this evaluation due to the insufficiency of data, the
initial observations of EC officials suggested that REA effectively implemented the new
tasks. For more details on the assessment of the preparation process initiated by the
Agency to accommodate the implementation of SEDIA, please refer to the in-depth study
area covering REA’s key success stories and lessons learned.

Legal validation of participants

The Participant Register is used as a single entry point for participants to submit legal
and financial information. REA ensures that registrations of legal entities are validated in
a timely fashion against the provided supporting documents. The table below lists the
indicators used to evaluate the performance of the legal validation function since 2014. It
can be noted that some indicators were introduced in 2014 and 2015. Other targets
have been added in line with the SEDIA scope (e.g. in the AWP 2018).

Table 30. Targets related to time to validate URF validation requests, from 2014 to mid-
2018.

Target | Result

2018 (situation on 30 June 2018)
To perform 95 % of validations within 90 days from the “raise priority” date (for all direct management operations at the EC) | 96 %

2017
To perform 95 % of validations within 90 days from the “raise priority” date | 95.1 %
129
This is based on the feedback received from REA at the beginning of December 2018.
2016
To perform 95 % of validations within 90 days from the “raise priority” date | 92.4 %

2015
To perform a duplicate search and contact 100 % of participants within 10 working days from the moment participants linked to a call appear in PDM with the right priority | 100 %
To validate participants within 10 working days after the submission of all documents | Not available

2014
To perform a duplicate search and contact 100 % of participants within 10 working days from the moment participants linked to a call appear in PDM with the right priority | 100 %
To validate participants within 10 working days after the submission of all documents | Not available130

Source: compiled by PPMI based on AWPs and AARs.

A video explaining the validation process under SEDIA has been produced by REA and it
is available for all stakeholders in the Funding & Tenders Portal.

Preparation of Financial Capacity Assessment (FCA)

In accordance with the Financial Regulation applicable to the budget of the European
Union, the decision on the financial capacity of the participants belongs to the Authorising
Officer managing the grant or procurement procedure. The mandate of REA as a service
provider therefore consists of centralising financial data from selected beneficiaries
and verifying summary financial information against supporting documents. The result of
REA's analysis is stored and visible to its clients in the PDM tool, which provides verified
and standardised financial information.

The available evidence for the analysed period suggests that the related services were
handled effectively, and no major issues linked to the service delivery were mentioned in
the AARs (see Part 2 for more details). For the period 2014 to 2016, REA handled from
1 600 to 2 000 FCA requests per year. As of 2017, the reporting on assessments
provided to clients changed: only those cases in which financial data had been provided
to REA clients were taken into account. In prior years, all automatic requests triggered by
the IT tools were counted, even if some were subsequently abandoned after analysis
because they were not needed by the respective Authorising Officers. This is why the
number of FCAs dropped to 1 300 in 2017. During the first half of 2018, due to the
gradual integration of new clients, the number of delivered FCAs amounted to 860.

Table 31. Targets related to support for financial capacity assessments, from 2014 to
mid-2018.

Target | 2014 results | 2015 results | 2016 results | 2017 results | Mid-2018 results
Support for financial capacity assessments (rounded) | 1 600 | 2 000 | 1 850 | 1 300 | 860

Source: based on data provided by REA.

130
A new indicator introduced in the REA AWP 2014. The performance could not be measured with the current
monitoring capacities of PDM.
In view of the provision of corporate financial validation services, REA carried out a study
on the extent and nature of the financial information needed by its new SEDIA clients.
The aim of the study was to ensure that REA would be able to provide the same financial
information as that used by these new clients before they joined the corporate solution.
An agreement was reached before the start of SEDIA on the common core financial data
to be provided by REA to all clients, which further enhances the harmonisation process.

Another step in the harmonisation process was the agreement reached in the Financial
Services' network of the Commission (RUF) on the way participants in grant procedures
are selected for financial capacity assessment. As the Authorising Officers managing the
grant procedures have the flexibility to select which participants should be checked, the
selection approach may vary from coordinators only (in certain programmes) to all
beneficiaries in the grant (in others). This goes against the principle of equal treatment
of applicants and could be questioned from a cost-efficiency perspective. The agreement
reached in the RUF to limit the options open to the Authorising Officers to two will
achieve a twofold objective: facilitating REA's processes and optimising IT tools on the
one hand, and reducing the burden on participants on the other.

As a future improvement, REA envisages assisting the different SEDIA clients in possible
harmonisation of the financial indicators used in the different grant programmes, as well
as in procurement procedures.

Conclusions

Overall, REA delivered a high quality and effective service to its clients and other
stakeholders in the area of the management and provision of central support services,
achieving and exceeding the KPIs set. As regards the supervision of proposal evaluation
activities and the management of central support services, the Agency coped well with
the increased workload following the expansion of its mandate in 2017. The increased
level of workload was primarily associated with important work carried out in facilitating
the on-boarding of the services in preparation for SEDIA implementation131.

The Agency improved its performance with respect to time-to-pay for expert evaluators
during 2015-2018 compared to the previous evaluation period. The feedback received
from the independent experts contracted by the Agency suggests they were highly
satisfied with the service provided and almost all experts (i.e. 99 %) were certainly or
potentially willing to work with REA in the future. Although the overall processes of
registration, selection and contracting of experts were assessed more favourably
compared to the previous evaluation period, satisfaction remained lower in several
areas: experts were less positive about the online evaluation system used, the quality of
the templates used for the reports, and the time available for drafting the reports132.

Concerning the management of the Research Enquiry Service, Financial Capacity Checks
and the participant Validation Services, the overall performance of the Agency was
favourable and target deadlines for validations were fully respected between 2015 and
2018.

131
Taking into account the level of advancement of the project, the evaluation could not assess the
implementation of SEDIA, thus it remains to be performed at a later stage.
132
These issues are not fully under the control of REA as it relies on corporate tools and templates provided by
DIGIT/CSC or timing that results from the TTG targets.
Assessment of the newly introduced key business processes & efficiency gains achieved

This study area explores the newly introduced key business processes and the efficiency
gains achieved. The main objective of this case study is to assess whether changes to the key
business processes introduced by REA for the evaluation of proposals, preparation of
grant agreements and grant management led to an improved performance of the Agency
in terms of its KPIs and satisfaction of beneficiaries. Progress made in the 2015-2018
period of operations is then compared with the situation in the previous 2012-2015
period of operations. The study employed a combination of desk research and interviews
with REA management and the parent DGs. In addition, the results from survey A were
taken into account.

Evolution of the performance of REA in terms of its KPIs and satisfaction of beneficiaries

During the 2015-2018 period, REA in cooperation with the Commission continued the
optimisation of its procedures and programme management functions and introduced a
number of simplifications. The improvements and simplifications concerned wider use of
IT tools and realisation of electronic workflows, wider use of remote evaluation of
proposals, improved procedures for allocating proposals to the most suitable experts,
measures to automate and improve the detection of possible conflicts of interest, etc.
REA’s Networks played an important role in the learning and development process and in
the corresponding optimisation and simplification of REA’s key business processes.

It is also important to note that the current implementation of H2020 programmes
delegated to REA benefits from the overall simplification of H2020 implementation
modalities133, such as simpler management of projects (a single funding rate for all
beneficiaries and all activities covered by a grant, no time-sheets for personnel working
full time on EU projects, acceptance of average personnel costs and beneficiaries’ usual
accounting practices for direct costs, etc.), the no-negotiation approach following
proposal evaluation phase, the limitation of the mandatory ex ante financial capacity
checks, simplified validation of legal entities, etc.

The analysis showed that all these simplifications and optimisations of the procedures,
processes and delivery tools, combined with the growing maturity of the Agency and its
striving for learning and development, had a significant impact on the performance of the
Agency in reaching its KPIs.

For example, REA’s average time-to-grant (TTG) decreased from 351 days in 2010 to
222 days in 2014 and to 193 days during 2016-2018. Further, the share of grants
concluded within the TTG target rose to 99 % for 2015 calls and to nearly 100 % for
2016-2017 calls, whereas grants concluded exceeding TTG limits mostly related to
extensions granted to the beneficiaries on their request or to specific circumstances
(such as security scrutiny, very complex ethics reviews, etc.).

The Agency also demonstrated an improved performance in terms of time-to-pay (TTP),
where performance for H2020 payments was generally higher compared to FP7,
especially with regard to H2020 gross TTP134, which was facilitated by the fully electronic
H2020 grant management IT tools.
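The gross/net TTP distinction (explained in footnote 134 below) can be sketched as a simple calculation. The following is purely illustrative and not part of the report: the function name, dates and stop-the-clock periods are invented for the example, and the sketch assumes calendar-day counting for simplicity.

```python
from datetime import date

def ttp_days(claim_submitted, funds_transferred, stop_clock_periods):
    """Illustrative gross and net time-to-pay (TTP) calculation.

    gross TTP: calendar days between submission of the payment claim
    and the transfer of funds by the Agency (the beneficiary's view).
    net TTP: gross TTP minus the days during which the clock was
    stopped while awaiting additional documents or clarifications
    from the beneficiary ('stop-the-clock' mechanism).
    """
    gross = (funds_transferred - claim_submitted).days
    stopped = sum((end - start).days for start, end in stop_clock_periods)
    return gross, gross - stopped

# Example: claim submitted 1 March, paid 10 April, with a 12-day
# stop-the-clock period awaiting clarifications from the beneficiary.
gross, net = ttp_days(date(2017, 3, 1), date(2017, 4, 10),
                      [(date(2017, 3, 10), date(2017, 3, 22))])
# gross = 40 days, net = 28 days: on a net basis the payment respects
# a 30-day limit even though more elapsed time passed.
```

This illustrates why gross TTP can be significantly longer than net TTP, as described in footnote 134.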

Similarly, during 2015-2018, REA significantly improved its performance in terms of
timely processing of requests for grant amendments submitted by beneficiaries. Time-to-

133
Compared to FP7.
134
Gross TTP measures the actual time elapsed between the submission of the payment claim and the transfer
of funds by the Agency and shows the payment processing time from the beneficiary’s perspective. Gross TTP
could be significantly longer than net TTP, as delays by beneficiaries in providing additional information in
response to requests from REA for additional documents or clarifications are discounted from the net TTP (the
‘stop-the-clock’ mechanism is applied).
amend (TTA) performance for H2020 grants was significantly better compared to FP7,
especially in terms of gross TTA, which is more important from the beneficiaries’
perspective (the average gross TTA for H2020 was around 3 times lower compared to
FP7 in 2015-2017).

All these developments resulted in an improved satisfaction of beneficiaries with the
performance of REA in relation to the timeliness of evaluation, contracting and grant
management processes (please see Figure 44 below).

- The time period from the call deadline to the time the outcome of the proposal was announced to you (i.e. time-to-inform) was appropriate: 2018: 36 % strongly agree, 48 % rather agree; 2015: 28 % strongly agree, 49 % rather agree
- The time period from the announcement of your proposal’s outcome to the time you signed the contract (i.e. time-to-contract) was appropriate: 2018: 35 % strongly agree, 49 % rather agree; 2015: 31 % strongly agree, 45 % rather agree
- The overall time period from submission of the proposal to signature of the grant agreement (i.e. time-to-grant) was appropriate: 2018: 32 % strongly agree, 48 % rather agree; 2015: 26 % strongly agree, 46 % rather agree
- For interim payments: the time it took the Agency to process payment requests and make payments was appropriate: 2018: 41 % strongly agree, 48 % rather agree; 2015: 36 % strongly agree, 43 % rather agree
- For the final payment: the time it took the Agency to process the payment request and make payment was appropriate: 2018: 44 % strongly agree, 45 % rather agree; 2015: 27 % strongly agree, 37 % rather agree
- For grant amendments: the time it took the Agency to process grant amendment requests was appropriate: 2018: 38 % strongly agree, 30 % rather agree; 2015: 25 % strongly agree, 39 % rather agree

Figure 44. Satisfaction of beneficiaries with the performance of REA in relation to timeliness of the evaluation, contracting and
grant management processes.
Note: The first line indicates the results of the 2018 survey, second – 2015 survey. Source: 2018 and 2015 surveys of REA’s
beneficiaries.

Moreover, improved performance of the Agency also had a significant impact on the
qualitative aspects of programme delivery (e.g. the share of evaluation review/redress
cases filed by applicants decreased from 3 % in 2011 to 1.2 % in 2017, while the share
of upheld cases decreased from 0.8 % in 2011 to 0.3 % in 2017) and led to an increased
satisfaction of beneficiaries with various aspects related to application, contracting and
grant management processes (see Figure 45 below).

- Information for applicants was easy to find: 2018: 38 % strongly agree, 49 % rather agree; 2015: 37 % strongly agree, 48 % rather agree
- Information for applicants was clear: 2018: 36 % strongly agree, 49 % rather agree; 2015: 33 % strongly agree, 51 % rather agree
- The requirements for the application process were reasonable and proportionate: 2018: 26 % strongly agree, 47 % rather agree; 2015: 24 % strongly agree, 45 % rather agree
- The evaluation process was transparent: 2018: 36 % strongly agree, 44 % rather agree; 2015: 30 % strongly agree, 42 % rather agree
- Requests from REA (e.g. for proposal modification or providing missing information) were clear: 2018: 40 % strongly agree, 40 % rather agree; 2015: 39 % strongly agree, 40 % rather agree
- The feedback I received on the progress with the content in my project (e.g. in a mid-term review) was useful: 2018: 48 % strongly agree, 27 % rather agree; 2015: 35 % strongly agree, 38 % rather agree
- Technical reporting requirements were clear (2018 survey): 37 % strongly agree, 42 % rather agree
- Financial reporting requirements were clear (2018 survey): 36 % strongly agree, 42 % rather agree
- Project reporting requirements were clear (2015 survey): 32 % strongly agree, 44 % rather agree
- The periodic reporting requirements were appropriate to the level of activities in my project (2018 survey): 36 % strongly agree, 45 % rather agree
- Project reporting requirements were reasonable and proportionate (2015 survey): 33 % strongly agree, 44 % rather agree
- The process of monitoring my project by REA was smooth (2018 survey): 39 % strongly agree, 37 % rather agree
- The process of monitoring my project by REA was transparent (2018 survey): 46 % strongly agree, 32 % rather agree
- The process of monitoring our project was clear and transparent (2015 survey): 35 % strongly agree, 36 % rather agree

Figure 45. Satisfaction of beneficiaries with the various aspects of application, contracting and grant management processes.
Note: Unless noted otherwise, the first line indicates the results of the 2018 survey, second – 2015 survey. Source: 2018 and
2015 surveys of REA’s beneficiaries.

The beneficiaries’ survey also revealed that the level of satisfaction related to the
user-friendliness of the IT tools employed for the application and grant management was
relatively low; this level of satisfaction has not improved compared to the previous
evaluation period (see Figure 46 below).

- The electronic tool used for submitting the application was user-friendly: 2018: 27 % strongly agree, 44 % rather agree; 2015: 29 % strongly agree, 42 % rather agree
- The electronic tools used in the negotiation/contracting process were user-friendly: 2018: 24 % strongly agree, 42 % rather agree; 2015: 24 % strongly agree, 41 % rather agree
- The electronic tools used for managing my grant were user-friendly: 2018: 27 % strongly agree, 35 % rather agree; 2015: 25 % strongly agree, 42 % rather agree

Figure 46. Satisfaction of beneficiaries with the IT tools used for application, contracting and grant management processes.
Note: The first line indicates the results of the 2018 survey, second – 2015 survey. Source: 2018 and 2015 surveys of REA’s
beneficiaries.

It is important to note that IT tools play an increasingly important role in programme and
grant management. During the evaluation period these tools became more complex and
embedded more processes, allowing paperless workflows, etc. As already indicated in this
case study and our evaluation report, the analysis showed that the development of the IT
tools contributed to an improved delivery of the programme management functions
(improved TTG, TTP, TTA, etc.) and growing beneficiary satisfaction with most processes
during the grant life-cycle (application, contracting and grant management). On the
other hand, the demands on IT tools and their user-friendliness also grew, which could
explain, at least to some extent, the relatively low level of satisfaction with the
user-friendliness of the IT tools among beneficiaries. Thus, further efforts are needed to
improve the user-friendliness and user experience of the corporate IT tools used for
programme management purposes.

Conclusions

During the 2015-2018 period, REA, in cooperation with the Commission, continued the
optimisation of its procedures and programme management functions and introduced a
number of simplifications. The analysis showed that all these simplifications and
optimisations of the procedures, processes and delivery tools, taken together with the
growing maturity of the Agency and its striving for learning and development, had a
significant impact on the performance of the Agency in reaching its KPIs and on the
growing satisfaction of beneficiaries with REA's performance during various stages of the
project life-cycle.

Similar to the 2015 beneficiaries’ survey, the lowest levels of satisfaction of beneficiaries
in 2018 were consistently related to the user-friendliness of the IT tools employed
throughout the project life-cycle135. It is important to note that IT tools play an
increasingly important role in programme and grant management. During the evaluation
period these tools became more complex and embedded more processes, allowed
paperless workflows, etc. As noted above, our analysis showed that the development of

135
Which for H2020 are developed centrally by the Commission’s Common Support Centre.
the IT tools contributed to an improved delivery of the programme management
functions (improved KPIs, etc.) and growing beneficiary satisfaction with most processes
of the grant life-cycle. On the other hand, the demands on IT tools and their
user-friendliness also grew, which could explain, at least to some extent, the relatively
low level of satisfaction with the user-friendliness of the IT tools among beneficiaries.
Thus, further efforts are needed to improve the user-friendliness and user experience of
the corporate IT tools related to programme management.

Assessment of REA’s key success stories and lessons learned during 2015-2018

This in-depth study area provides an evidence-based analysis of REA’s key success
stories and lessons learned during 2015-2018. As noted in section 4.1.1., the Agency
respected its legal framework in 2015-2018 and successfully accommodated key changes
and challenges, which were effectively and smoothly introduced during the evaluation
period (particularly in relation to REA’s extended mandate in 2014 and 2017). In view of
the emerging changes and needs, the Agency undertook a number of measures to
flexibly adjust its internal operations and procedures. Relying on desk research as well as
the feedback gathered throughout the interview and survey programmes, we identified a
number of topics and areas of focus where specific actions/simplifications were
introduced to REA’s internal arrangements to facilitate the management of the tasks
delegated to it. While Table 32 presents an overview of the Agency’s key success stories
and lessons learned during the evaluation period, they are discussed in more detail
below.

Table 32. The overview of REA’s key success stories and lessons learned in 2015-2018.

Success stories | Lessons learned
HR and organisational reorganisation | Systematic ex ante controls for the FP7 SME actions
Introduction of remote evaluations | Adjustment of the internal control system
Contributions of the REA networks | Strengthening of the role of the Staff Committee
Contributions to the roll-out of SEDIA | Development of a new policy feedback strategy

Source: compiled by PPMI.

Key success stories

HR and organisational reorganisation

REA redefined its organisational structure and adopted additional HR management
measures in view of the adoption of the Horizon 2020 framework, the reorganisation of
the Commission in 2014 and the subsequent extension of its mandate136. As a result,
the Agency ensured appropriate staffing levels across all of its units and an appropriate
ratio of administrative to operational staff. No business continuity challenges were
mentioned during the interview programme.

Although no major organisational changes have occurred since 2015 in Departments A
and B, which work on programme implementation, two units in Department C underwent
restructuring. Two sectors under Unit C3 were divided into three: Legal Validation of
participants, Financial Validation of participants, and Legal and Financial Verification.
Unit C4 was restructured from two sectors into three: Expert Contracting, Expert
Payments and Evaluation Support. Department C also comprises other functions, such as
accountancy, budget officers, ex post audits, HR and ICT. The Agency is governed by the
Steering Committee and the Director. Please refer to the organisational chart of REA in
Figure 47 below.

136 The Agency started to fully implement its new mandate with an increased portfolio of activities (namely the
FET-Open, SC2, SC6, SC7-cyber security, SwafS and SEWP programmes) and a wider range of administrative
and logistical support services, extended to additional clients in 2015, as discussed in section 3.2.
Figure 47. Organisational chart of REA (as of April 2018).
Note: SME legacy management is shared among Units A5, B2 and B3. In addition, Unit C3 sectors provide legal and financial
validation to participants (rather than 3rd parties). Source: Adapted from:
https://ec.europa.eu/info/sites/info/files/rea_organisational_chart_16_may_2018_web.pdf, accessed in April 2018.

In addition to the changes in its organisational structure, REA optimised its processes to
accommodate the substantial increase in the number of proposals submitted and the
resulting higher workload and lower success rates. One notable measure was the
workload measurement exercise, which facilitated reallocations of staff between units.
Furthermore, as REA grew continuously, it took advantage of various new HR measures
to adjust the units’ staffing to their needs as additional actions and services were added.
For instance, the Agency relied on the CAST procedure for the recruitment of contract
staff and on the inter-Agency market for the recruitment of TAs.

The Agency also adopted a comprehensive HR strategy in 2017 to link its strategic HR
objectives with REA’s overall strategic multi-annual objectives, and to support the
implementation of the action plan adopted in response to the 2016 Commission Staff
Satisfaction Survey results. The Agency also devoted considerable efforts to
progressively improve its HR operations by making use of various monitoring and
reporting measures.

Moreover, REA collaborated with its parent DGs, the CSC and other services to fine-tune
the business processes and update documents for Horizon 2020 as a whole, and
contributed to the development of the IT systems by DG DIGIT and the CSC, which led to
a fully electronic workflow for the expert contracting and payment process. Refer to
section 4.1.2 for more details on the simplifications and workload management measures
introduced.

Introduction of remote evaluations

In 2016, in response to the steadily rising number of submitted proposals and the space
limitations for large calls, REA introduced fully remote evaluation procedures for calls
with high numbers of proposals, reserving the central evaluations in Brussels for the
most difficult and complex cases and for panel meetings. As discussed in section 4.1.2,
this was one of the main changes that affected the Agency during the evaluation period.

To accommodate this gradual change from mainly on-site to largely remote evaluations,
REA introduced a number of improvements. The expert briefings were improved through
the development of short animated videos, which raised the quality of the briefings and
made them more engaging. REA further developed the manual for evaluators and
organised extensive web-briefings as well as question-and-answer web-sessions. The
evaluation workflows were adjusted over the years to accommodate the fully remote
process and to include an additional quality check of the reports. Teleconferencing
facilities were advanced when a teleconference function, and particularly a discussion
module for the consensus stage, was integrated into the SEP evaluation tool. This proved
particularly useful for critical cases and for accelerating late consensus discussions (e.g.
MSCA ITN 2018). In 2018, REA decided to create additional polyvalent meeting rooms
with suitable audiovisual equipment to compensate for the loss of the seventh floor137,
to increase the maximum capacity of the space managed by REA from 580 to 750
experts, and to further facilitate the remote participation of experts who cannot attend
on-site evaluations138. In addition, the new rooms allowed for more central panel review
discussions (after the fully remote process).

The use of remote evaluations allowed REA to accommodate the steadily rising number of
submitted proposals despite the space limitations for large calls. In addition, this change
brought numerous benefits to the experts and the Agency during the evaluation period.
For instance, travel and accommodation costs as well as daily allowances for experts
were reduced, and the office space used for evaluations remained stable regardless of
the number of proposals evaluated. REA was able to rely on a wider pool of experts,
including those with specific expertise and profiles (e.g. from industry, SMEs and third
countries) who had difficulties coming to Brussels for the central consensus meetings. As
a result, both the cost efficiency and economy of the process and the quality of the
evaluation process improved. Additionally, the increased use of remote evaluation has
had a positive side-effect on the

137 Following the transfer of ERCEA’s evaluation premises to the seventh floor of the Covent Garden complex
(COVE) in the course of 2018.
138 REA 2018 Interim report, p. 32.
environment. Please refer to section 4.1.2 for a detailed analysis of the evaluation
process.

Contributions of the REA networks

REA’s networks – namely the Network of Call Coordinators (NCC), Legal Officers (NLO),
Project Officers (NPO), Financial Officers (NFO) and Ethics Correspondents (NEC) –
contributed significantly to the Agency’s success during the evaluation period. They
assumed a key role in developing improvements to numerous REA operations, such as
the management of calls (from call publication to grant signature) and of grants in
H2020. In addition, they contributed to the development of guidance, the sharing of
good practices, and the harmonised implementation of the H2020 control system
provided by the CSC. Below we outline the networks’ major recent contributions to the
improvements introduced to some of the Agency’s key operations.

One area where REA's networks played an important role was the evaluation process and
related procedures. The Network of Call Coordinators continuously discussed the
experience gained and the issues encountered during evaluations, and shared good
practices for tackling them. Notable examples of its specific contributions included
participation in a task force on "maintaining and improving the quality of experts", in
cooperation with the Network of Project Officers, DG RTD and INEA, reporting back to the
CSC Executive Committee139. The NCC also contributed significantly to the
harmonisation of the experts’ briefings by complementing them with topics relating to
conflict of interest and confidentiality. Moreover, following the IAS audit on H2020 grant
management, which addressed the use of a common IT solution for the detection of
potential conflicts of interest (the CoI report in SEP), the NCC held meetings with DG
DIGIT and DG RTD to discuss the issues encountered by operational units during their
evaluations (crash of the system for ITN, false positive notifications, etc.). As a result,
the new SEP report was presented to the NCC in June 2018.

Another area where REA’s networks had a significant impact on the Agency’s operations
was ex ante controls and reinforced monitoring measures, which were further advanced
with the development of the relevant IT tools140. In the context of the IAS audit on
H2020 grant management, the Network of Financial Officers elaborated on the standard
ex ante controls guidance established by the CSC. The NFO produced detailed
descriptions of the checks that may be performed at each step and reiterated which
checks should not be performed when following the trust-based approach. Moreover, the
NFO gave presentations to each unit to communicate the guidance, and a first update
was made to take into account the initial lessons learned and the updates in COMPASS,
in particular the Risk Log. To ensure a proper balance between risk and trust, the NFO
also organised a series of ex ante controls workshops (e.g. on reinforced monitoring and
the types of third parties).

Contributions to the roll-out of SEDIA

As mentioned in section 3.2 and further detailed in the subsequent sections and in the
in-depth study on the assessment of the management and provision of the central
support services, REA assumed a major role in the development and roll-out of SEDIA in
2017. Though the volume of transactions did not reach the CBA estimate in the first
semester of 2018, REA carried out important work in facilitating the on-boarding of new
clients in preparation for the significantly higher workload expected during the second
semester141.

To ensure a sound and forward-looking management of its resources in line with the
growth of its activities in scope and volume, REA carried out significant preparatory work

139 REA 2018 Interim Report, p. 30.
140 REA 2018 Interim Report, pp. 33-34.
141 REA 2018 Interim Report, p. 37.
involving the alignment of business processes, preparation of the IT tool and
development of the necessary organisational structures.

Despite the extensive preparatory work done in 2017, REA still faces important
challenges with respect to various organisational, technical and legal aspects of the
project. These include the development of a common framework; the revision of the
Rules on Legal Entity Validation, LEAR Appointment and Financial Capacity Assessment;
and multiple ongoing IT developments. Under the authority of the GPSB, DG RTD, REA,
DG DIGIT and other Commission services are developing the above-mentioned solutions.
One of the key challenges was the timely legal and financial validation of participants,
owing to the larger number of new clients and the integration of procurement activities.
The transfer of participant validations for the whole of the Commission and the other
Executive Agencies to REA had to be implemented gradually throughout 2018.

To cope with some of these challenges, REA put in place transitional solutions where
needed, relying on a learning-by-doing approach and its extensive experience in
providing validation services for the Research family and other EU programmes; for
instance, some IT tools needed for SEDIA had not yet been fully developed during the
first semester of 2018. Another obstacle experienced by the Agency was that some
clients were not ready to integrate their activities with the corporate IT suites that give
direct access to REA validation services. In light of the gradual evolution of SEDIA and its
impact on the IT environment, the Agency has been involved in regular coordination
meetings with the IT teams supporting SEDIA in DG DIGIT and DG RTD Unit J4, seeking
to resolve the emerging issues.

Lessons learned

Introduction of systematic ex ante controls for the FP7 SME actions

On the basis of the results of the desk audit campaigns on SME beneficiaries, the Agency
introduced systematic ex ante controls for the FP7 SME actions to mitigate the risk linked
to the non-recording of RTD performers’ invoices, allowing part of the reservation specific
to the SME programme in the 2015 Declaration of Assurance to be lifted142. Throughout
the evaluation period, REA continued strengthening ex ante controls for ongoing projects
of the SME actions to ensure that any irregularity is corrected before the final payment of
the grants. As a result, this risk did not materialise during the evaluation period143, and
REA was able to maintain a high level of performance in handling ongoing FP7 projects
despite the higher workload associated with the additional ex ante controls for the SME
instrument.

In the implementation of the FP7 programme – unlike under H2020 – REA’s activities
involved the processing of cost claims and the related payments, as well as ex post
controls and recovery orders. A major risk identified in the SME actions was – as reported
in the Agency’s 2015 AAR – that SMEs would not record the RTD performers' invoices in
their accounts. This risk stems from the fact that SMEs often mandate the project
coordinator to use their share of the grant to pay the RTD performer directly, on their
behalf, at the agreed prices.

According to the results of the desk audit campaigns launched by REA in 2014 and 2015,
this risk had a high prevalence with a financial exposure of more than 10 %. Accordingly,
REA included a reservation in the 2014 AAR and decided to introduce as from
1 January 2015, specific additional ex ante controls which would be systematically
performed to ensure the appropriate registration of RTD performers' invoices in the
SMEs' accounts, before making any final payment. Throughout the evaluation period,

142 REA 2015 Annual Activity Report, p. 4.
143 It should be noted that under H2020 REA is no longer in charge of the SME actions; thus REA is not
required to report on a detected error rate specific to this action.
REA continued strengthening ex ante controls for ongoing projects of the SME actions,
applying them for all reporting periods for which final deliverables were submitted as of
1 January 2015.

The Agency effectively reduced its financial exposure by putting these ex ante controls in
place. In the 2016 AAR, REA estimated the detected error rate for this segment of the
programme at 6.28 %. The financial impact of the systematic ex ante controls performed
on the cost claims before proceeding to payment remained stable (for instance, more
than EUR 18 million in 2016, a significant increase compared to the years before
2015)144. None of the cases and events in 2017 or the first semester of 2018 resulted in
significant financial exposure for REA, and mitigating measures have been put in place to
address the exceptions and non-compliance events reported145. The risk profile of the
SME actions, however, remained high over the evaluation period despite the introduction
of the additional ex ante controls.

Adjustment of the internal control system

Following the adoption of the new Internal Control Framework (ICF) by the Commission
in April 2017, REA assessed its internal control systems and concluded in its 2017 AAR
that the internal control principles were implemented and functioning as intended.
Nevertheless, the Agency devoted continuous efforts to adjusting its internal control
framework as of 1 January 2018 and to implementing the new ICF. The impact of the
new ICF was reflected in the smooth running of the Agency’s operations.

REA started to implement the new ICF by undertaking, in two stages, a systematic
examination of the available control results and indicators, as well as the observations
and recommendations issued by internal auditors and the European Court of Auditors.
First, the Agency established the draft Internal Control Monitoring Criteria in the draft
AWP 2018, published a summary of the new Framework based on the Budgweb pages on
the REA Intranet, and prepared a comparative (ICS/ICF) inventory of the structures,
procedures and processes in place, including the identification of their owners in view of
the intranet migration exercise. During the second stage, REA organised five workshops
dedicated to different sets of ICF principles, with the participation of REA management
and staff on a voluntary basis. The workshops aimed at assessing the current state of the
system, identifying strengths and weaknesses, and raising awareness. Regarding REA's
Internal Control Monitoring Criteria, the Agency also took into account the comments
received from the central services during the annual reporting exercise, which, together
with the outcome of the workshops, had implications for the final list of indicators. At the
request of the internal control team, the IT sector has been working on an in-house
solution based on SharePoint technology, which will allow a more flexible and
collaborative approach to monitoring action plans and collecting input from responsible
services. This tool, the REA Internal Control Monitoring tool (ICM), was expected to be
operational in the second half of 2018.

Following the review of its internal control systems, the most mature components of the
system were identified, including the Control activities, the Risk assessment and the
Monitoring. In terms of the Control environment and Communication activities, the
maturity level of the principles and their characteristics proved to be less homogeneous.
The existing control measures, which allow the Commission to maintain supervisory
control over REA's operations, include, among others, corporate EC processes (internal
control, financial resources, HR standards etc.), the design – coordinated by the Common
Support Centre – of the corporate control framework for the implemented programmes,
the operational cooperation with the parent DGs, corporate databases and dashboards
available and regular reporting exercises.

144 REA 2016 Annual Activity Report, p. 59.
145 REA 2017 Annual Activity Report, p. 69.
Strengthening the role of the Staff Committee

The Staff Committee assumed an important role in several key developments during the
evaluation period, including the adoption of new general implementing provisions, which
introduced a quicker granting of permanent employment contracts, and the 2018
reclassification exercise for TAs and CAs. In response to the emerging need for more
social dialogue between the Staff Committee and the management of the Agency, four
meetings a year were held (involving the Director, the Head of Unit responsible for
administrative tasks, a representative of the HR unit and other managers). Through its
participation in key HR events and the regular social dialogue with the Agency’s senior
management, which began in the 2015-2018 period, the role of the Staff Committee was
strengthened significantly. However, the interviews with REA staff revealed that the
members of the Staff Committee questioned the extent to which their suggestions were
taken into account by REA’s management. REA management, on the other hand, referred
to statutory restrictions preventing some of these suggestions from being implemented.

Nonetheless, REA noted in its 2018 AWP that management will continue to give full
support to the activities of the REA Staff Committee and to continue the social dialogue
with it. Additionally, it will further implement the action plan established following the
2016 staff satisfaction survey, with a particular focus on awareness of psycho-social and
physical well-being and on managerial peer learning and the sharing of best practice.

Development of a new policy feedback strategy

As discussed in sections 4.1.1 and 4.1.3, the provisions of the MoU relating to smooth
and effective feedback were based on a clear allocation of responsibilities and tasks
between REA and its parent DGs along all stages of the project life-cycle. Some of the
parent DGs and REA units enjoyed a level of flexibility (granted by the MoU) in setting
unique working practices at the unit level, suiting the needs of the specific portfolio of
each DG/unit, especially those with a long-standing collaboration dating back to FP7.
However, the lack of formal agreements had implications for several parent DGs and
units where the collaboration was more recent. In the absence of a unified policy
feedback strategy with clearly established responsibilities and tasks, REA invested
considerably during the evaluation period in mapping and identifying best practices of
policy feedback, in order to agree with its parent DGs on policy feedback activities that
are sustainable for the Agency while offering the most added value to the Commission.

REA has undertaken numerous actions to strengthen policy feedback during the
evaluation period. Through the development by the Policy Feedback Task Force (PFTF) of
a document entitled Strengthening Policy Feedback in REA: Mapping Insights and
Recommendations, the Agency documented good practices in the definition and
implementation of policy options. The PFTF also aimed to design a guidance document on
policy feedback which would support REA’s approach in creating a structured policy
feedback mechanism (refer to Figure 38).

The structured policy feedback mechanism described in the study area on coherence and
knowledge management was based on three guiding principles: (1) creating value;
(2) maintaining flexibility in managing the relationships between REA and its parent DGs;
and (3) becoming agile. In 2017, REA put in place a new mechanism to facilitate
structured dialogue between REA and its parent DGs, addressing weaknesses identified in
the 2012-2015 evaluation relating to the absence of such dialogue. As part of its internal
strategy, REA established the Project Monitoring and Policy Feedback Task Force, set up
by the REA Network of Project Officers, whose aim was to map, analyse and develop
recommendations to further strengthen the effectiveness and smooth operation of REA’s
Policy Feedback Function.

The PFTF concluded that while REA’s Policy Feedback Function has been proactive in
complying with the tasks required of it, REA lacks a common formal methodology and
well-established common practices in the provision of such feedback. To address this,
REA translated the overarching internal strategy into a detailed document – the
Catalogue of Policy Options – which demonstrated REA’s commitment to further
operationalise the PFF. The catalogue categorised the different alternative options into
policy pull and push measures. The various initiatives discussed in the study area on
coherence and knowledge management above demonstrate REA’s organisational
commitment to improving its capacity to provide coherent and effective policy feedback,
by strengthening business processes relating to knowledge management and through
the development of specific tools and instruments. Although some of the resulting
actions were already taking place at the time of the interview and survey programmes in
2018, they may not yet have shown their impact at the level of the policy officers. Please
refer to section 4.1.3 as well as the first in-depth study area for a detailed analysis of the
policy feedback activities.

Conclusions

Overall, REA, in cooperation with the Commission, effectively and smoothly adjusted its
internal operations and procedures to accommodate the key changes and challenges
during 2015-2018. The specific actions adopted during the evaluation period in the areas
of organisational structure and HR management, evaluation procedures and programme
management operations, as well as the provision of support in the context of SEDIA, are
regarded by this study as the Agency’s key success stories. Furthermore, REA has
effectively implemented specific actions and simplifications in response to the lessons
learned from the results of the desk audit campaigns on SME beneficiaries, the new
Internal Control Framework, the need to strengthen the role of the REA Staff Committee,
and the need to develop a new policy feedback strategy. These actions and
simplifications brought substantial changes to REA’s internal processes and procedures,
in particular improving the Agency’s capacity to absorb increases in workload, optimising
the delivery of support services, and harmonising the implementation of the delegated
programmes.

Annex 6 : Quantitative benchmarking of the Commission’s Executive Agencies

This Annex presents the general results of benchmarking. It is worth mentioning that
although all the Executive Agencies of the Commission manage the programmes
delegated to them, each of these Agencies has its own specificities. For instance, REA
manages the central support services, the EACEA has a specific unit A7 responsible for
Eurydice, and the ERCEA differs from other Executive Agencies in terms of ‘dual
leadership’ exercised by the ERC and the Commission. Four of these Agencies – the
ERCEA, REA, INEA and EASME – belong to the Research and Innovation family of the
Executive Agencies. All the Commission’s Agencies are based in Brussels except for
CHAFEA, which is situated in Luxembourg.

Our quantitative benchmarking of the Commission’s Executive Agencies revealed
substantial differences in their programme management costs. The Executive Agencies
belonging to the Research and Innovation family were found to perform more efficiently:
the values of programme management cost in this family of Agencies (the ratio between
their administrative and operational budgets in terms of payments) were below the
average for the Commission’s Executive Agencies overall (4.72 %), ranging from 0.89 %
(in the case of INEA) to 3.61 % (in the case of REA) in 2016. Excluding the costs of the
central support services, the total cost of programme management in REA was 2.58 % in
2016, the second smallest value among the Commission’s Executive Agencies after INEA
(which manages large infrastructure projects supported under the Connecting Europe
Facility and Horizon 2020 programmes).

In 2016, individual staff members at REA managed 20.71 proposals on average
(compared to an average of 19.05 across the Commission’s Agencies) and 10.60 running
projects (compared to an average of 8.67). If REA’s central support services are excluded
from the total staff, the Agency’s indicators improve to 25.20 proposals and 12.90
running projects per “operational staff” member. REA thus remains one of the most
efficient Agencies in the Commission in terms of these indicators.
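The per-head indicators follow the same pattern: each is a simple quotient of the volumes in Table 33 over the relevant staff count, with the “operational” variants using REA’s staff net of the central support services (628 total staff, of whom 516 operational, as reported in the table). A minimal sketch of the arithmetic:

```python
# Per-head workload indicators for REA, 2016 (volumes and staff counts from Table 33).
proposals, running_projects = 13_005, 6_658
total_staff, operational_staff = 628, 516  # operational = total minus central support services

print(round(proposals / total_staff, 2))              # 20.71 proposals per head
print(round(running_projects / total_staff, 2))       # 10.6 running projects per head
print(round(proposals / operational_staff, 2))        # 25.2 proposals per operational head
print(round(running_projects / operational_staff, 2)) # 12.9 running projects per operational head
```

These quotients reproduce the 20.71, 10.60, 25.20 and 12.90 figures cited in the paragraph above.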

Table 33. Main performance indicators of the Commission’s Executive Agencies, 2016.

| Indicator | REA | ERCEA | INEA | EASME | EACEA | CHAFEA | Average of all Agencies |
|---|---|---|---|---|---|---|---|
| Key parent DG (other parent DGs) | DG RTD (EAC, GROW, CNECT, AGRI, HOME) | DG RTD | DG MOVE (ENER, CNECT, RTD) | DG GROW (RTD, ENV, CLIMA, ENER, CNECT, MARE) | DG EAC (ECHO, HOME, CNECT) | DG SANTE (JUST, AGRI, GROW) | – |
| Staff in 2016 (actually filled) | 628 | 461 | 225 | 417 | 442 | 59 | 372 |
| Operational staff in 2016 (actually filled) | 516 | – | – | – | – | – | – |
| Operational budget (million EUR), commitments 2016 | 1 666.36 | 1 767.46 | 5 062.88 | 1 343.58 | 674.90 | 99.68 | 1 769.14 |
| Total administrative budget (million EUR), commitments 2016 | 59.74 | 42.87 | 21.53 | 35.51 | 47.96 | 8.67 | 36.05 |
| Operational budget (million EUR), payments 2016 | 1 642.94 | 1 457.68 | 2 447.74 | 1 013.41 | 647.32 | 80.73 | 1 214.97 |
| Total administrative budget (million EUR), payments 2016 | 59.23 | 42.62 | 21.87 | 36.29 | 46.58 | 8.16 | 35.79 |
| Total administrative budget (million EUR) without the central support services, 2016 | 42.42 | – | – | – | – | – | – |
| Proposals received in 2016 | 13 005 | 8 077 | 1 063 | 15 498 | 12 500 | 348 | 8 415.17 |
| Total running projects (at the end of 2016) | 6 658 | 5 459 | 662 | N/A | 6 028 | 254 | 3 812.20 |
| Programme management cost (ratio between the administrative and operational budget, %), commitments 2016 | 3.59 | 2.43 | 0.43 | 2.64 | 7.11 | 8.70 | 4.15 |
| Programme management cost (ratio between the administrative and operational budget, %), payments 2016 | 3.61 | 2.92 | 0.89 | 3.58 | 7.20 | 10.11 | 4.72 |
| Programme management cost (excluding the central support services, %), payments 2016 | 2.58 | – | – | – | – | – | – |
| Budget ‘per head’ (million EUR), payments 2016 | 2.62 | 3.16 | 10.88 | 2.43 | 1.46 | 1.37 | 3.65 |
| Budget ‘per operational head’ (million EUR), payments 2016 | 3.18 | – | – | – | – | – | – |
| No. of proposals ‘per head’, 2016 | 20.71 | 17.52 | 4.72 | 37.17 | 28.28 | 5.90 | 19.05 |
| No. of proposals ‘per operational head’, 2016 | 25.20 | – | – | – | – | – | – |
| No. of running projects (at the end of 2016) ‘per head’ | 10.60 | 11.84 | 2.94 | N/A | 13.64 | 4.31 | 8.67 |
| No. of running projects ‘per operational head’, 2016 | 12.90 | – | – | – | – | – | – |

Source: PPMI, based on the 2016 Annual Activity Reports of the Executive Agencies, the final report of the CHAFEA evaluation, and administrative data provided by REA at the end of January 2019.

It is also possible to compare the Commission’s Executive Agencies based on the results
of our retrospective CBA and the Commission’s staff surveys. For instance, the volume of
cost savings generated by REA was EUR 19.81 million in 2015 and EUR 23.46 million in
2016, exceeding the annual savings achieved by EACEA, CHAFEA, ERCEA and INEA (see
the table below). In terms of HR indicators, the engagement of REA staff stood at 67 %,
slightly above the average of the Commission’s Executive Agencies (66 %). As many as
76 % of the Agency’s staff were satisfied overall with their job, which was higher than
the average for these Agencies (72 %).

Table 34. CBA and HR indicators of the Commission’s Executive Agencies, 2015-2017.

| Indicator | REA | ERCEA | INEA | EASME | EACEA | CHAFEA | Average of all Agencies |
|---|---|---|---|---|---|---|---|
| Cost savings to the EU budget (the results of CBA, million EUR), 2015 | 19.81 | 14.50 | 5.68 | N/A | 17.15 | 0.01 | 11.43 |
| Cost savings to the EU budget (the results of CBA, million EUR), 2016 | 23.46 | 17.87 | 13.46 | N/A | 19.50 | 0.08 | 14.87 |
| Cost savings to the EU budget (the results of CBA, million EUR), 2017 | 23.61 | 19.65 | N/A | N/A | 20.45 | N/A | 21.24 |
| Staff well-being (%), 2016 | 49 | 58 | 49 | 55 | 47 | 33 | 49 |
| Overall job satisfaction (%), 2016 | 76 | 75 | 71 | 72 | 70 | 68 | 72 |
| Staff engagement (%), 2016 | 67 | 70 | 68 | 70 | 65 | 55 | 66 |

Source: the CBA results of the Executive Agencies’ evaluations carried out by PPMI; the 2016 European Commission Staff Survey: Analysis of the findings, September 2016.

Furthermore, we also compared the grant application and implementation processes in
the Commission’s Executive Agencies based on 2018 survey data. As indicated in the
evaluation report, overall 86 % of beneficiaries were very satisfied or satisfied with the
Agency’s services in 2018, which was higher than the overall level of satisfaction
reported by the beneficiaries of CHAFEA (74 %) but somewhat lower than those of
EACEA and ERCEA (both 89 %). The table below shows that REA also achieved good
satisfaction levels for the specific processes: the application process (80 %), the
evaluation process (68 %), the granting process (87 %), the monitoring process (75 %)
and the grant amendment process (67 %). These values are higher than the levels of
satisfaction achieved by the other Commission Executive Agencies, except for a few
cases where ERCEA (for the application and evaluation processes) or EACEA (for the
grant amendment process) demonstrated a higher level of satisfaction. Almost all the
experts (99 %) contracted by REA stated that they would certainly or possibly consider
working with the Agency in the future. The corresponding figure for experts working for
other Commission Agencies was equal (99 % in the case of CHAFEA) or somewhat lower
(97 % in the case of EACEA). Overall, this comparative analysis indicates that REA ran
relatively smooth grant application and management processes, from the standpoint of
applicants, beneficiaries and experts, during the evaluation period.

Table 35. Satisfaction with the processes of application and grant management in the Executive Agencies (REA, EACEA, CHAFEA and ERCEA).

Surveys compared: REA evaluation 2015-2018 (2018 survey); EACEA evaluation 2015-2017 (2018 survey)*; CHAFEA evaluation 2014-2016 (2018 survey); the ERCEA H2020 Participant Survey (2018).

| Survey question | REA overall | REA beneficiaries | REA unsuccessful applicants | EACEA overall | EACEA beneficiaries | EACEA unsuccessful applicants | CHAFEA overall | CHAFEA beneficiaries | ERCEA overall | ERCEA beneficiaries | ERCEA unsuccessful applicants |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Overall satisfaction | 76 | 86 | 55 | 79 | 89 | 66 | 74 | | | 89 | |
| The application process was clear and transparent | 80 | 84 | 70 | 80 | 84 | 76 | 78 | 82 | | 92 | 68 |
| The evaluation process was transparent | 68 | 80 | 40 | 67 | 79 | 63 | 77 | 69 | | 95 | 34 |
| The granting process was transparent | 87 | 87 | | 87 | 87 | | 78 | 78 | | | |
| The process of monitoring my project by the Agency was smooth | 75 | 75 | | | | | 72** | 72** | | | |
| The grant amendment process was smooth | 67 | 67 | | 87 | 87 | | 64 | 64 | 62 | 62 | |

Notes: * positive grouping (points 1-4 on the Likert scale) was used for comparison; ** the statement was “the process of monitoring my project by CHAFEA was clear and transparent”.
Source: surveys of beneficiaries and unsuccessful applicants conducted by PPMI as part of the evaluations of REA, CHAFEA and EACEA; the ERC H2020 Participant Survey carried out by the ERCEA. Figures are the shares of very satisfied and satisfied respondents unless indicated otherwise.

Annex 7: Data related to retrospective CBA

Estimated and actual costs of the in-house (Commission) and the Executive
Agency scenarios

Table 36 and Table 37 below present the results of the analysis of the estimated and
actual costs of the in-house (Commission) and the Executive Agency scenarios.

Table 36. SFS estimated costs of the in-house and the Executive Agency scenarios (146), EUR.

| Cost item | No. 2015 | Cost 2015 | No. 2016 | Cost 2016 | No. 2017 | Cost 2017 | No. 2018 | Cost 2018 | Total 2015-2018 |
|---|---|---|---|---|---|---|---|---|---|
| In-house scenario: Commission | | | | | | | | | |
| Title I. Staff-related expenditure | 428.0 | 38 391 600 | 513.0 | 46 016 100 | 593.0 | 53 192 100 | 816.6 | 73 253 900 | 210 853 700 |
| of which: TA | 299.6 | 32 356 800 | 359.1 | 38 782 800 | 415.1 | 44 830 800 | 571.7 | 61 743 600 | 177 714 000 |
| of which: CA | 128.4 | 6 034 800 | 153.9 | 7 233 300 | 177.9 | 8 361 300 | 244.9 | 11 510 300 | 33 139 700 |
| Title II. Infrastructure and operating expenditure | | 9 844 000 | | 11 799 000 | | 13 639 000 | | 18 781 800 | 54 063 800 |
| Total Commission cost | | 48 235 600 | | 57 815 100 | | 66 831 100 | | 92 035 700 | 264 917 500 |
| In-house scenario: REA | | | | | | | | | |
| Title I. Staff-related expenditure | 240.3 | 14 909 583 | 197.2 | 12 197 036 | 136.1 | 8 475 904 | 0 | 0 | 35 582 522 |
| of which: TA | 71.9 | 7 207 472 | 58.3 | 5 844 167 | 41.3 | 4 140 036 | 0 | 0 | 17 191 675 |
| of which: CA | 168.4 | 7 702 111 | 138.9 | 6 352 869 | 94.8 | 4 335 868 | 0 | 0 | 18 390 848 |
| Title II. Infrastructure and operating expenditure | | 5 113 824 | | 4 196 613 | | 2 896 344 | | 0 | 12 206 782 |
| Total REA cost | | 20 023 407 | | 16 393 649 | | 11 372 248 | | 0 | 47 789 304 |
| TOTAL Titles I and II | | 68 259 007 | | 74 208 749 | | 78 203 348 | | 92 035 700 | 312 706 804 |
| Title III. Programme support expenditure | | 8 434 000 | | 8 501 000 | | 7 857 000 | | 7 153 000 | 31 945 000 |
| TOTAL COST | 668.3 | 76 693 007 | 710.2 | 82 709 749 | 729.1 | 86 060 348 | 817 | 99 188 700 | 344 651 804 |
| Externalisation scenario: REA | | | | | | | | | |
| Title I. Staff-related expenditure | 630 | 38 113 000 | 670 | 39 951 000 | 684.1 | 40 520 000 | 744.5 | 42 875 000 | 161 459 000 |
| of which: TA | 157.5 | | 167 | | 170.8 | | 182.8 | | 0 |
| of which: CA | 472.5 | | 503 | | 513.3 | | 561.7 | | 0 |
| Interim supportive agents and trainees | | | | | | | | | |
| Professional development and recruitment costs | | | | | | | | | |
| Title II. Infrastructure and operating expenditure | | 13 924 000 | | 15 063 000 | | 15 710 000 | | 16 915 000 | 61 612 000 |
| Total REA cost | | 52 037 000 | | 55 014 000 | | 56 230 000 | | 59 790 000 | 223 071 000 |
| Externalisation scenario: Commission | | | | | | | | | |
| Title I. Staff-related expenditure | 14.4 | | 14.2 | | 13.9 | | 13.5 | | 0 |
| of which: TA | 13.2 | | 13.3 | | 13.2 | | 13.1 | | 0 |
| of which: CA | 1.2 | | 0.9 | | 0.7 | | 0.4 | | 0 |
| Title II. Infrastructure and operating expenditure | | | | | | | | | 0 |
| Total Commission cost | | 2 092 000 | | 2 298 000 | | 2 527 000 | | 2 817 000 | 9 734 000 |
| TOTAL Titles I and II | | 54 129 000 | | 57 312 000 | | 58 757 000 | | 62 607 000 | 232 805 000 |
| Title III. Programme support expenditure | | 8 434 000 | | 8 501 000 | | 7 857 000 | | 7 153 000 | 31 945 000 |
| TOTAL COST | 644.4 | 62 563 000 | 684.2 | 65 813 000 | 698.0 | 66 614 000 | 758 | 69 760 000 | 264 750 000 |
| ESTIMATED SAVINGS | 24 | 14 130 007 | 26 | 16 896 749 | 31 | 19 446 348 | 59 | 29 428 700 | 79 901 804 |

(146) The budget estimates of the in-house scenario were not updated in the SFS (although the Commission would require an additional 51.6 FTEs to manage the updated mandate in 2020). Therefore, the in-house scenario’s budget in our analysis was calculated on the basis of the estimated number of staff and the CBA/SFS average cost assumptions.
Source: SFS, annual financial reports, AWPs, own analysis.

Table 37. Actual costs of the in-house and the Executive Agency scenarios, EUR.

| Cost item | No. 2015 | Cost 2015 | No. 2016 | Cost 2016 | No. 2017 | Cost 2017 | No. 2018 | Cost 2018 | Total 2015-2018 |
|---|---|---|---|---|---|---|---|---|---|
| In-house scenario: Commission | | | | | | | | | |
| Title I. Staff-related expenditure | 447.0 | 39 584 200 | 536.0 | 48 174 400 | 620.0 | 57 366 800 | 847.6 | 81 827 300 | 226 952 700 |
| of which: TA | 299.6 | 32 656 400 | 359.1 | 39 860 100 | 415.1 | 47 736 500 | 571.7 | 68 032 300 | 188 285 300 |
| of which: CA | 147.4 | 6 927 800 | 176.9 | 8 314 300 | 204.9 | 9 630 300 | 275.9 | 13 795 000 | 38 667 400 |
| Title II. Infrastructure and operating expenditure | | 10 281 000 | | 12 328 000 | | 14 260 000 | | 20 342 400 | 57 211 400 |
| Total Commission cost | | 49 865 200 | | 60 502 400 | | 71 626 800 | | 102 169 700 | 284 164 100 |
| In-house scenario: REA | | | | | | | | | |
| Title I. Staff-related expenditure | 240.3 | 14 737 181 | 197.2 | 13 670 352 | 136.1 | 9 691 358 | 0 | 0 (147) | 38 098 891 |
| of which: TA | 71.9 | 7 092 582 | 58.3 | 6 633 322 | 41.3 | 4 669 313 | 0 | 0 | 18 395 217 |
| of which: CA | 168.4 | 7 644 599 | 138.9 | 7 037 030 | 94.8 | 5 022 045 | 0 | 0 | 19 703 674 |
| Title II. Infrastructure and operating expenditure | | 3 422 020 | | 2 875 033 | | 1 836 620 | | 0 | 8 133 673 |
| Total REA cost | | 18 159 202 | | 16 545 385 | | 11 527 977 | | 0 | 46 232 564 |
| TOTAL Titles I and II | | 68 024 402 | | 77 047 785 | | 83 154 777 | | 102 169 700 | 330 396 664 |
| Title III. Programme support expenditure | | 7 803 491 | | 7 997 991 | | 9 279 174 | | 7 442 680 | 32 523 336 |
| TOTAL COST | 687.3 | 75 827 893 | 733.2 | 85 045 776 | 756.1 | 92 433 951 | 848 | 109 612 380 | 362 920 000 |
| Externalisation scenario: REA | | | | | | | | | |
| Title I. Staff-related expenditure | 618 | 37 584 248 | 628 | 42 585 390 | 693 | 48 240 856 | 736 | 52 212 805 | 180 623 299 |
| of which: TA | 154 | 14 595 739 | 146 | 16 087 060 | 163 | 17 753 527 | 175 | 19 792 745 | 68 229 071 |
| of which: CA | 464 | 19 268 949 | 482 | 22 687 161 | 530 | 25 882 060 | 561 | 28 693 940 | 96 532 111 |
| Interim supportive agents and trainees | | 1 329 400 | | 1 554 285 | | 1 735 500 | | 1 100 000 | 5 719 185 |
| Professional development and recruitment costs | | 2 390 160 | | 2 256 884 | | 2 869 769 | | 2 626 120 | 10 142 932 |
| Title II. Infrastructure and operating expenditure | | 8 800 701 | | 9 155 786 | | 9 351 781 | | 9 920 355 | 37 228 623 |
| Total REA cost | | 46 384 949 | | 51 741 176 | | 57 592 637 | | 62 133 160 | 217 851 922 |
| Externalisation scenario: Commission | | | | | | | | | |
| Title I. Staff-related expenditure | 14.4 | 1 495 200 | 14.2 | 1 518 600 | 13.9 | 1 550 900 | 13.5 | 1 578 900 | 6 143 600 |
| of which: TA | 13.2 | 1 438 800 | 13.3 | 1 476 300 | 13.2 | 1 518 000 | 13.1 | 1 558 900 | 5 992 000 |
| of which: CA | 1.2 | 56 400 | 0.9 | 42 300 | 0.7 | 32 900 | 0.4 | 20 000 | 151 600 |
| Title II. Infrastructure and operating expenditure | | 331 200 | | 326 600 | | 319 700 | | 324 000 | 1 301 500 |
| Total Commission cost | | 1 826 400 | | 1 845 200 | | 1 870 600 | | 1 902 900 | 7 445 100 |
| TOTAL Titles I and II | | 48 211 349 | | 53 586 376 | | 59 463 237 | | 64 036 060 | 225 297 022 |
| Title III. Programme support expenditure | | 7 803 491 | | 7 997 991 | | 9 279 174 | | 7 442 680 | 32 523 336 |
| TOTAL COST | 632.4 | 56 014 841 | 642.2 | 61 584 366 | 706.9 | 68 742 411 | 750 | 71 478 740 | 257 820 358 |
| ESTIMATED SAVINGS | 55 | 19 813 052 | 91 | 23 461 410 | 49 | 23 691 540 | 98 | 38 133 640 | 105 099 643 |

(147) Following the CBA and SFS assumptions, legacy 2007-2013 programmes would be managed by REA until 2017. Any left-over legacy would then be handed over to the Commission.
Source: SFS, annual financial reports, AWPs, own analysis.
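The savings rows in Table 36 and Table 37 are simple differences between the two scenarios’ total yearly costs. As an illustration, the minimal Python sketch below (the variable names are ours, not REA’s) reproduces the actual savings per year and in total from the figures in Table 37:

```python
# Yearly TOTAL COST figures (EUR) from Table 37: the counterfactual in-house
# (Commission) scenario versus the actual Executive Agency (externalisation) scenario.
in_house = {2015: 75_827_893, 2016: 85_045_776, 2017: 92_433_951, 2018: 109_612_380}
externalised = {2015: 56_014_841, 2016: 61_584_366, 2017: 68_742_411, 2018: 71_478_740}

# The saving in each year is the cost avoided by delegating to the Agency.
savings = {year: in_house[year] - externalised[year] for year in in_house}

print(savings)                # per-year savings, e.g. 19 813 052 for 2015
print(sum(savings.values()))  # cumulative 2015-2018 saving
```

The cumulative figure comes out at EUR 105 099 642; the EUR 1 difference from the table’s total (EUR 105 099 643) reflects rounding in the underlying yearly figures.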

REA’s operational budget 2014-2018

The tables below provide information on REA’s operational budget (148) for 2014-2018, by
programme and funding source.

Table 38. Actual REA operational budget 2014-2018, EUR million.

| Programme | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018 |
|---|---|---|---|---|---|---|
| FET OPEN | 80.00 | 81.61 | 91.45 | 114.09 | 189.86 | 557.00 |
| MSCA | 841.64 | 811.03 | 829.81 | 900.65 | 986.82 | 4 369.94 |
| ITN | 440.17 | 426.39 | 437.69 | 482.33 | 540.84 | 2 327.41 |
| IF | 243.52 | 220.80 | 224.11 | 256.16 | 273.01 | 1 217.60 |
| RISE | 70.00 | 80.00 | 80.00 | 80.06 | 80.00 | 390.06 |
| COFUND | 80.00 | 83.84 | 80.01 | 82.10 | 80.94 | 406.89 |
| NIGHT | 7.95 | 0.00 | 8.00 | 0.00 | 12.03 | 27.98 |
| SPACE | 103.69 | 75.98 | 102.31 | 96.76 | 104.60 | 483.35 |
| SC2 | 233.39 | 151.09 | 298.18 | 391.25 | 396.31 | 1 470.22 |
| SC6 | 99.62 | 97.96 | 80.64 | 110.66 | 119.98 | 508.86 |
| SC7 | 147.12 | 127.59 | 88.54 | 143.79 | 219.83 | 726.86 |
| SEWP | 47.82 | 66.71 | 94.00 | 137.62 | 138.00 | 484.14 |
| SWaFS | 37.70 | 56.33 | 41.02 | 58.42 | 61.25 | 254.72 |
| Experts | 47.35 | 51.72 | 47.55 | 53.70 | 54.08 | 254.40 |
| Total | 1 638.31 | 1 520.00 | 1 673.51 | 2 006.94 | 2 270.73 | 9 109.50 |

Source: data provided by REA.

Table 39. Actual EEA/EFTA and third countries’ contributions, EUR million.

| Programme | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018 |
|---|---|---|---|---|---|---|
| FET OPEN | 0.00 | 17.84 | 2.07 | 0.29 | 0.50 | 20.69 |
| MSCA | 32.90 | 70.59 | 72.18 | 69.40 | 83.39 | 328.46 |
| ITN | 11.60 | 60.95 | 66.78 | 59.84 | 82.44 | 281.60 |
| IF | 21.30 | 5.80 | 5.40 | 7.46 | 0.01 | 39.97 |
| RISE | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| COFUND | 0.00 | 3.84 | 0.01 | 2.10 | 0.94 | 6.89 |
| NIGHT | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| SPACE | 4.48 | 0.00 | 9.33 | 15.61 | 0.00 | 29.42 |
| SC2 | 7.56 | 9.55 | 6.56 | 22.98 | 42.50 | 89.13 |
| SC6 | 2.98 | 7.15 | 0.00 | 7.27 | 3.86 | 21.26 |
| SC7 | 7.36 | 3.51 | 2.51 | 6.58 | 23.06 | 43.02 |
| SEWP | 0.24 | 1.09 | 3.86 | 3.42 | 3.53 | 12.15 |
| SWaFS | 0.00 | 5.57 | 0.75 | 4.80 | 4.27 | 15.38 |
| Experts | 10.42 | 12.53 | 4.81 | 8.61 | 5.98 | 42.35 |
| Total | 65.95 | 127.83 | 102.07 | 138.94 | 167.08 | 601.87 |

(148) In executed commitment appropriations.
Source: data provided by REA.

Table 40. Actual EEA/EFTA contributions, EUR million.

| Programme | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018 |
|---|---|---|---|---|---|---|
| FET OPEN | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| MSCA | 23.87 | 21.69 | 20.99 | 20.01 | 20.64 | 107.20 |
| ITN | 5.56 | 21.69 | 20.99 | 20.01 | 20.64 | 88.89 |
| IF | 18.31 | 0.00 | 0.00 | 0.00 | 0.00 | 18.31 |
| RISE | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| COFUND | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| NIGHT | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| SPACE | 1.64 | 0.00 | 4.36 | 4.38 | 0.00 | 10.37 |
| SC2 | 7.56 | 1.58 | 0.00 | 10.31 | 4.39 | 23.83 |
| SC6 | 2.02 | 4.07 | 0.00 | 3.98 | 3.86 | 13.93 |
| SC7 | 5.55 | 1.49 | 0.00 | 1.21 | 4.81 | 13.07 |
| SEWP | 0.00 | 0.67 | 2.88 | 3.42 | 2.86 | 9.83 |
| SWaFS | 0.00 | 0.00 | 0.00 | 1.43 | 1.52 | 2.94 |
| Experts | 1.09 | 0.07 | 0.13 | 1.78 | 0.00 | 3.06 |
| Total | 41.73 | 29.56 | 28.35 | 46.51 | 38.07 | 184.23 |

Source: data provided by REA.

Table 41. Actual third countries’ contributions, EUR million.

| Programme | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018 |
|---|---|---|---|---|---|---|
| FET OPEN | 0.00 | 17.84 | 2.07 | 0.29 | 0.50 | 20.69 |
| MSCA | 9.02 | 48.90 | 51.19 | 49.39 | 62.75 | 221.26 |
| ITN | 6.03 | 39.26 | 45.78 | 39.83 | 61.80 | 192.71 |
| IF | 2.99 | 5.80 | 5.40 | 7.46 | 0.01 | 21.66 |
| RISE | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| COFUND | 0.00 | 3.84 | 0.01 | 2.10 | 0.94 | 6.89 |
| NIGHT | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| SPACE | 2.85 | 0.00 | 4.98 | 11.23 | 0.00 | 19.05 |
| SC2 | 0.00 | 7.97 | 6.56 | 12.67 | 38.11 | 65.30 |
| SC6 | 0.96 | 3.09 | 0.00 | 3.29 | 0.00 | 7.34 |
| SC7 | 1.81 | 2.02 | 2.51 | 5.37 | 18.25 | 29.95 |
| SEWP | 0.24 | 0.42 | 0.98 | 0.00 | 0.68 | 2.32 |
| SWaFS | 0.00 | 5.57 | 0.75 | 3.37 | 2.75 | 12.44 |
| Experts | 9.34 | 12.46 | 4.69 | 6.83 | 5.98 | 39.28 |
| Total | 24.22 | 98.27 | 73.72 | 92.43 | 129.00 | 417.64 |

Source: data provided by REA.


Table 43. Actual EU budget contributions (149), EUR million.

| Programme | 2014 | 2015 | 2016 | 2017 | 2018 | Total 2014-2018 |
|---|---|---|---|---|---|---|
| FET OPEN | 80.00 | 63.77 | 89.38 | 113.80 | 189.36 | 536.31 |
| MSCA | 808.74 | 740.44 | 757.63 | 831.25 | 903.43 | 4 041.49 |
| ITN | 428.57 | 365.44 | 370.92 | 422.49 | 458.40 | 2 045.82 |
| IF | 222.22 | 215.00 | 218.71 | 248.70 | 273.00 | 1 177.63 |
| RISE | 70.00 | 80.00 | 80.00 | 80.06 | 80.00 | 390.06 |
| COFUND | 80.00 | 80.00 | 80.00 | 80.00 | 80.00 | 400.00 |
| NIGHT | 7.95 | 0.00 | 8.00 | 0.00 | 12.03 | 27.98 |
| SPACE | 99.20 | 75.98 | 92.98 | 81.16 | 104.60 | 453.92 |
| SC2 | 225.83 | 141.54 | 291.62 | 368.28 | 353.81 | 1 381.09 |
| SC6 | 96.63 | 90.81 | 80.64 | 103.39 | 116.12 | 487.59 |
| SC7 | 139.75 | 124.08 | 86.03 | 137.21 | 196.77 | 683.84 |
| SEWP | 47.58 | 65.61 | 90.14 | 134.20 | 134.47 | 471.99 |
| SWaFS | 37.70 | 50.75 | 40.27 | 53.62 | 56.98 | 239.33 |
| Experts | 36.93 | 39.19 | 42.74 | 45.10 | 48.10 | 212.06 |
| Total | 1 572.37 | 1 392.17 | 1 571.44 | 1 868.00 | 2 103.66 | 8 507.63 |

(149) C1 (except EEA/EFTA and third countries’ contributions) + C4/C5.
Source: data provided by REA.
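Taking the 2014-2018 totals in Tables 38-41 and 43 at face value, the split of the operational budget by funding source can be cross-checked arithmetically. The short sketch below (variable names are ours) verifies that the EU budget contribution plus the EEA/EFTA and third-country contributions add up to the total operational budget:

```python
# 2014-2018 totals, EUR million, as reported in the tables above.
total_budget = 9_109.50      # Table 38: total operational budget
eea_efta = 184.23            # Table 40: EEA/EFTA contributions
third_countries = 417.64     # Table 41: third countries' contributions
eu_budget = 8_507.63         # Table 43: EU budget contributions

# Table 39 reports the combined EEA/EFTA and third-country contributions.
combined = round(eea_efta + third_countries, 2)
assert combined == 601.87

# The funding sources sum back to the total operational budget.
assert round(eu_budget + combined, 2) == total_budget
```

Rounding to two decimals is needed because the published figures are themselves rounded to EUR 0.01 million.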

Getting in touch with the EU
IN PERSON
All over the European Union there are hundreds of Europe Direct information centres.
You can find the address of the centre nearest you at: https://europa.eu/european-union/contact_en

ON THE PHONE OR BY EMAIL


Europe Direct is a service that answers your questions about the European Union.
You can contact this service:
– by freephone: 00 800 6 7 8 9 10 11 (certain operators may charge for these calls),
– at the following standard number: +32 22999696, or
– by email via: https://europa.eu/european-union/contact_en

Finding information about the EU


ONLINE
Information about the European Union in all the official languages of the EU is available on the Europa
website at: https://europa.eu/european-union/index_en

EU PUBLICATIONS
You can download or order free and priced EU publications from:
https://op.europa.eu/en/publications. Multiple copies of free publications may be obtained by
contacting Europe Direct or your local information centre (see https://europa.eu/european-
union/contact_en)

EU LAW AND RELATED DOCUMENTS


For access to legal information from the EU, including all EU law since 1952 in all the official language
versions, go to EUR-Lex at: http://eur-lex.europa.eu

OPEN DATA FROM THE EU


The EU Open Data Portal (http://data.europa.eu/euodp/en) provides access to datasets from the EU.
Data can be downloaded and reused for free, for both commercial and non-commercial purposes.
The main objective of this study was to assess the effectiveness,
efficiency and coherence of the implementation of parts of the
European Union programmes by REA in the period from 2015 to
2018. The study was organised into the following four tasks:

1. Assessment of the regulatory framework, the mission and


governance of REA;
2. Assessment of REA’s performance in 2015-2018;
3. Cost–benefit analysis;
4. Synthesis, conclusions and policy recommendations.

Overall, the evaluation of REA during 2015-2018 suggests that


the delegation of the programmes to the Agency was justified in
terms of cost savings and value added. The CBA exercise found
that the actual savings of the Agency scenario were higher than
the initial estimates (EUR 105 million compared to EUR 80 million
under the SFS estimates) during this period. REA was effective in
achieving its objectives, and its stakeholders were highly
satisfied with the performance of the Agency. Overall, REA
remained one of the most efficient and cost-effective Agencies in
the Commission during 2015-2018.

Studies and reports
