
Developing research communications capacity: lessons from recent experiences

Mendizabal Ltd

Enrique Mendizabal (onthinktanks.org) and Martine Zeuthen (Integrity Research)

Key recommendations
Do not take demand for granted: Demand from research organisations should be confirmed by those contracted to provide capacity development support. It is often based not on a real interest in learning and adapting but on a desire to please and manage relations with donors.

Consider organisational culture and context in any needs assessment: Instead of focusing only on technical considerations, needs assessments should enquire into the culture of the organisation, its attitude towards different forms of communication, its business models, the roles of key individuals within it, and how its external context affects its choices of communications tactics and channels.

Build on what works: Even the weakest organisations communicate with their stakeholders. Support ought to be built around the competencies and skills that the organisations already have, seeking to add new components, tools, and links to other communication approaches.

Background
What is the best approach to developing the research communications capacities of researchers and their organisations? This is a question not often asked by research funders and service providers. The Overseas Development Institute (ODI), through its Research and Policy in Development (RAPID) programme, and the International Network for the Availability of Scientific Publications (INASP), however, did ask it. In 2011, each organisation designed and delivered a capacity development project to improve the communications capacity of several African research grantees of the International Development Research Centre (IDRC). Both organisations were independently approached by IDRC to support its grantees to 1) communicate the research they had produced as part of their involvement in IDRC's ACACIA programme, and 2) develop their own organisational research communication capacities. Enrique Mendizabal and Martine Zeuthen reviewed both initiatives and provided feedback to ODI and INASP. This document brings those reviews together and shares the lessons learned with a wider audience.

The approaches
ODI and INASP pursued different approaches, each informed and affected by a number of internal and external factors. In ODI's case, the project presented an opportunity to test some of the lessons the ODI team had identified in previous collaborations with other IDRC programmes; for INASP, it was an opportunity to explore a new area of work focused more on the supply than the demand side of research-based evidence, given that its experience lies in research literacy and in working with policymakers.

ODI's approach comprised three phases. The first phase involved a series of webinars with all three of the organisations ODI was supporting. During the webinars, the ODI team presented the RAPID Outcome Mapping Approach as well as several more specific research communications tools. The webinars were supposed to be followed by the grantees working on their communication strategies, which were then to be reviewed by ODI during country visits. In practice, the webinars could not be implemented as planned and did not lead to draft strategies. The field visits therefore constituted, in effect, the second phase of the project. During these visits, ODI's consultant reviewed the content of the webinars and worked with the researchers and others in their organisations to develop the draft strategies. The third phase was a follow-up phase during which some of the grantees worked on their strategies.

INASP's approach was markedly different from ODI's. Its project began with a long diagnostic involving several interviews with the grantees. This informed a workshop for all the grantees, in which IDRC participated actively. The workshop was followed by the grantees developing strategies and INASP supporting their implementation. In the instances where strategies were followed through, this support was provided through direct mentoring.

Both projects made important and valuable contributions to their clients but could not meet their full potential.

Lessons learned from the assessment


The following lessons are drawn from the reviews of ODI's (Mendizabal and Zeuthen, 2012) and INASP's (Mendizabal and Zeuthen, 2012) interventions, as well as from an evaluation of ODI's support to IDRC's GGP programme (Mendizabal, 2009) and RAPID's capacity building experience (Mendizabal, Datta and Young, 2011). They are intended, however, to be relevant to other capacity development initiatives and efforts.

The best laid plans: In both cases, as well as in other cases consulted for the purpose of this review, the interventions did not go as planned. There was little that ODI or INASP, as service providers, could control, and several grantees faced conflicting demands, lost interest, or were simply not able to take advantage of the services offered.

An expression of interest does not always imply commitment: Although the grantees had expressed their interest in being involved in the projects, several were not engaged in learning and did not change their approach to research communications as a consequence. In one case, a grantee explained that their involvement was based on the impression that the project appeared to be important for IDRC and ODI. In other words, their participation was driven more by an interest in being part of such an initiative to satisfy donor demands than by the initiative itself.

Researchers have other interests and pressures besides communications: A related and by now well-known set of lessons is that most researchers are not only often more interested in researching than in communicating, but the business models of their organisations often demand that they spend as much time as possible seeking and delivering new projects. Additionally, while an individual project may be a priority for the donor or for the lead partner, it is unlikely to be so for individual organisations or researchers. As a consequence, any activities that are not seen to directly support their core business are unlikely to be given the priority they demand to be effective.

Face-to-face is better than virtual, but the web is a good alternative: ODI's original proposal had been to host the grantees for a few weeks to give them a chance to meet the team in charge of research communications and even participate in some activities. The idea was rejected and the webinars were introduced as an alternative. They worked well, but not as well as INASP's event. The event did not just offer an opportunity for face-to-face engagement between INASP and the grantees; it also ensured that the latter were focused on the objectives of the project and not distracted by their day-to-day responsibilities. This distraction, in ODI's case, contributed to the webinars not meeting their full potential.

If it is not done at the beginning, then it is probably too late: In all cases the researchers had either finished or were about to finish the research. The capacity development project came about as a final activity for the grantees, added with only months to go. Furthermore, while the support provided was intended to lead to a communications strategy, there were no additional funds to implement such a strategy. As a consequence, researchers had few incentives to engage more than necessary.

The right people matter: Both ODI and INASP intended to develop the capacity of the networks or organisations involved, not just that of the individuals who participated in the capacity development projects. The ambition was for the people receiving the support to then go on and train or mentor other members of their networks or organisations. Unfortunately, those who participated were not always the right people for this objective. Senior researchers, network coordinators, and even communicators may be excellent candidates to make use of any skills learned during the webinars or workshops, but that does not necessarily make them the most appropriate trainers of trainers. This was particularly obvious in cases where the participants were coordinators of time-bound networks that had been set up only for the purpose of delivering the ACACIA programme, which was by then coming to an end.

Local or regional facilitators and mentors: INASP's approach involved using regionally based facilitators and mentors. This had a particularly positive effect on the project. The partners learned from the mentors and enjoyed discussing the specific challenges they were facing with regional professionals. Conversely, ODI was able to connect with the grantees it was supporting only after visiting their offices, and concerns were raised about the consultant's lack of familiarity with their context.

No one is starting from scratch: All the grantees, to different degrees, have some sort of research communications capacity. In some cases, their personal and professional networks ensure greater levels of impact than any formal research communication strategy could ever promise. Furthermore, many communication tactics and channels that are common in developed countries or the United Kingdom, and with which ODI and INASP are more familiar, may not be appropriate for the grantees' contexts. Hence, regardless of the level of professionalism or resources awarded to research communications, none of the grantees were starting from scratch.

Recommendations
The following recommendations are intended to inform future efforts to develop research communications capacity, but they could also apply to the development of other competencies and skills.

Start right from the beginning: Developing the capacity to communicate should not come as an afterthought. Funders must plan for it from the start, and service providers like ODI and INASP should be careful about getting involved if this is not the case. In those circumstances, the support should target future projects instead.

Confirm demand before starting: Even before signing a contract, the service providers should contact the grantees and effectively treat them as clients, inquiring about their interests, concerns, and commitment to the initiative. The service providers must be very clear about the time and resources that will have to be allocated to the process. They must also discuss, at length, who the most appropriate people are to be directly involved and what their responsibilities will be.

More than a needs assessment: really understand the organisation and its context: The service provider should start by either spending time with the organisation or hosting the relevant people. It is often possible, for example, for a researcher or a communications officer to work remotely while hosted by the other organisation and simply learn by being there. Above all, the service providers need to understand the culture of the organisations and the policy contexts they seek to affect. This is not something easily achieved through a remote diagnostic.

Consider who is the most appropriate source of expertise: The organisations conducting the assessment are not necessarily the most appropriate ones to deliver the support. Would they limit their recommendations to the services they can offer?

Build on strengths: Recognising that the organisations already communicate their work, the service providers should seek either to improve what they already do or to introduce new channels or tactics that build on those they are comfortable with. This is likely to make a bigger impact than the consultants bringing along an entirely foreign and all-encompassing new approach.

Focus on the organisation rather than on single projects: Support should be aimed at strengthening the organisation's capacity and not just a single project's visibility. This is likely to attract the support of senior managers, which is crucial for any change to take hold within the organisation. The project itself can be used as a pilot to test the new tactics or channels proposed.

Earmark funds to implement whatever strategy they develop: It is unlikely that the organisations will dedicate the necessary time to developing a strategy or plan unless they know that there will be funds available to implement it. Just as the service providers are not helping for free, the researchers are unlikely to be able to dedicate the necessary time to the initiative unless their time is covered. The service provider should therefore make sure that there are sufficient funds for this purpose. On the other hand, if the organisation has the funds but is not willing to allocate them to this purpose, this should be seen as a sign that there is little buy-in from the leadership.

Maximise peer-to-peer exposure: Depending on the kind of skills being shared and the individuals involved, the donors and service providers should attempt to ensure that people with the right experience deliver the support. Researchers, for example, are more likely to respond to other researchers; communication officers to communication officers; and managers to managers. This means that the service providers may have to look beyond their own organisations for the right expertise. Instead, they may act as facilitators and help the organisations find the most appropriate people for their needs.

Developing research communications capacity: lessons from recent experiences by Enrique Mendizabal and Martine Zeuthen is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
