Strengthening research impact: the LIFT model for leaders and managers
There are many resources on bridging research and policy, but rather fewer on how to lead and manage others in this domain. That was a focus of a recent webinar for research leaders at the University of Nottingham, facilitated by Nottingham’s Institute for Policy and Engagement, and by On Think Tanks, the global platform for supporting policy research. The PowerPoint is here, with links to resources from ODI (the Overseas Development Institute, London) and elsewhere. The first part of the presentation deals with the general field, and will be familiar to those who have followed my work, but the LIFT model is new.
Leaders and managers have a natural interest in strengthening research impact. It is good for the reputation of their institutions, and for the reputation and profile of individual researchers. Further, in the UK at least, it is a criterion used in allocating public funding: the guidelines for the Research Excellence Framework assessment, scheduled for 2021, state specifically that the exercise will assess ‘the reach and significance of impacts on the economy, society, culture, public policy or services, health, the environment or quality of life that were underpinned by excellent research . . .’, with a weight in the overall assessment of 25 per cent.
Even more important than all this, researchers and research institutions in many disciplines have an intrinsic commitment to making the world a better place. They do not want their research sitting on a shelf.
There can, however, be a gap between aspiration and practice. Researchers may lack skills or institutional support. And they may not feel recognised or rewarded for making a commitment to policy engagement. This is where the LIFT model can help senior staff set a course and provide practical support.
LIFT: Leadership. Incentives. Facilities. Training.
On Leadership, it goes without saying that good work on policy should be recognised and celebrated. But more than that, impact needs to be embedded in overall strategy and planning. Not all institutions have Mission Statements, but those that do need to make sure that policy impact is enumerated alongside other objectives like high quality research and teaching. For think-tanks, this is especially important. At ODI, in my time, we defined our mission as follows:
‘ . . . to inspire and inform policy and practice which lead to the reduction of poverty, the alleviation of suffering and the achievement of sustainable livelihoods in developing countries. We do this by locking together high-quality applied research, practical policy advice and policy-focused dissemination and debate. . .’.
A Mission Statement provides an overall vision, with the benefit that staff, including new recruits, can be in no doubt as to the purpose of the organisation. It then needs to be translated into strategy and annual Business Plans, including for units and programmes. Business Plans need to specify public affairs outputs, even outcomes, with resource allocations to match. Formats differ, but Business Plans will often specify what policy questions are being tackled, what research questions follow, and what policy engagement options will be pursued. Sometimes, this will be straightforward, because funding has been secured in advance, thus reducing uncertainty. In other cases, the Business Plan will set a direction of travel and will require adjustment as resources are secured: doing so does not, however, remove the imperative of fixing on the final destination.
The public affairs team will also have an entry in the Business Plan, of course. In an ideal world, there will be more bids for their support than can be accommodated, which will encourage some friendly competition and institutional prioritisation. Which, for example, will be the flagship reports for the year? A commitment to advance planning of this kind also reminds researchers that they need to build in a public affairs plan at the beginning of major projects.
On Incentives, researchers are unlikely to throw themselves enthusiastically into policy if all the institutional incentives are for them to concentrate on publishing academic articles or on teaching. This means that policy engagement needs to be specified in individual work programmes and that it needs to form part of appraisal and evaluation. Some institutions have adopted complex quantitative measures to assess performance across different domains; others use a more qualitative balanced scorecard, which personally I have always found more appropriate: rather than trying to calculate just how many newspaper op-eds are equivalent to a single journal article (which newspaper? which journal?), it is better to have a group of evaluators make an informed judgement about overall performance. The rewards can be monetary (promotion, additional increments, bonus payments) or non-monetary.
An important issue is whether objectives, assessment and reward will be individual or team-based. Probably some combination is best. In the policy arena, that allows for the fact that some researchers may make significant contributions but find public affairs difficult.
On Facilities, if researchers are to make policy impact, they need institutional support. Some things they can do on their own – a Twitter account, a Facebook page, a personal email distribution list – but mostly they will want to make use of institutional formats. That means institutions (departments, programmes) need websites, social media platforms, policy timetables, media, political and policy-related contact lists, and a variety of other ‘products’, standardised for institutional identity. These can include Briefing Papers for policy-makers, Opinion pieces, Blogs, public or semi-public meetings, press releases, and the like. Editing support will be needed. And public affairs teams will normally want some control over outputs, in order to secure quality: the ‘brand’ of an institution, its reputation among policy-makers, is vital to having influence.
Public affairs input has to be paid for, and this is often a delicate subject within institutions. Is the public affairs infrastructure an overhead charge on researchers, adding to their fund-raising targets? Or can it be funded from project revenues? The answer is that project funding is highly desirable, and should be built into budgets whenever possible, but that a stable, minimum infrastructure needs to be guaranteed, and will require overhead funding.
On Training, some researchers turn out to be ‘naturals’ at communication and policy influence. However, it cannot be assumed that all researchers, especially young recruits, will be expert authors of policy briefs, or brilliant on TV. Indeed, some of the most important outputs, like writing policy briefs, turn out to be among the hardest things researchers are asked to do. That is why institutions need to invest in training and mentoring.
One tool that can be especially useful is the ‘Story of Change’, a kind of after-action review of policy engagement, describing what was done and analysing lessons. Stories of Change can help participants learn from their own work; and if institutions develop a portfolio of such cases, they can also provide a repository of good ideas, and be turned into training cases.
It would be useful to have comments on these issues, and personal experiences, even Stories of Change. Please do make use of the comments box. See also the blog by Stephen Meek, who leads the Nottingham Institute and has set up a ‘policy academy’ (with On Think Tanks) to support researchers at his university.