The future is bright…

…if the funding is right, as David Coombe, director of conference planning at ARMA explains

As much as one tries not to be driven by the Research Excellence Framework (REF) – and successful institutions (whatever their mission) will ensure that they are not – the sheer complexity of the process and intensity of the final push toward the submission date inevitably seem to take up all one’s attention. 

In the past, the year or two after the REF (or RAE as it was then) was a time for putting the logistical complexities of the process behind you and turning your attention to other things. Those other demands have not gone away – if anything, they seem only to become more pressing – but this time there really is no getting away from the REF. When the dust settled on the RAE 2008, administrators up and down the land were to be heard muttering ‘Never again…’: fill in the blank with
the particular aspect of the process which fits your situation.

For most of us, the problem was the inadequacy of our ICT systems to support the preparations and the final submission. Networking events thereafter have been dominated by coffee-break chats about systems developments and off-the-shelf options. The market has clearly matured considerably since then: the number of exhibitors at events such as the ARMA Annual Conference, the number of delegates taking advantage of the demonstrations, and the number of companies now working on integrated CRIS solutions (by development or by merger) all show how far the sector has come since the RMAS project was conceived to meet a very real need – though take-up across the sector suggests that there is still a long way to go before the market is saturated.

But that was as far as it went. Generally one could expect to spend the first half of the next RAE cycle pretty much focusing on anything but.

This time the issue will be ‘impact’ and preparations for the next REF have begun before the submission button was pressed on this one. Over the last three years all of us have been in reactive mode, ferreting out the impact that had happened over the last six years quite without any interference from REF strings (my own institution was established in 1895 ‘for the betterment of society’, and arguably this has been our purpose, and research only the means, ever since) and seeking desperately to divine the meanings of the panel criteria and the likely values of the panels. And just how does one provide accurate attribution of impact to research, still more corroboration of impact?

We will learn more when the results are released and submissions published in a year’s time. But few institutions will wait until then to turn a reactive process into a proactive one. Notwithstanding the 2* threshold for underpinning research, we will continue to seek to undertake only the highest-quality research and to rely on impact only from that. But we, like others no doubt, will seek to learn the lessons of this REF and turn them to our advantage by ensuring we provide the environment in which high-quality research has the very best chance of making an impact, in tracking that impact back to the research (the ‘significance’ question) and tracking it beyond the immediate (in time, in place) to the long term and the widespread (the ‘reach’ question); and in recording corroborations as they occur.

Each RAE/REF exercise has brought challenges of its own. In this one the change of name meant little; the introduction of impact significantly more. Just when we feel we have some grasp of the impact process, HEFCE introduces a new pitfall: the prospect of open access as a condition of eligibility for submission to the next REF. 
We support open access and recognise the value of the REF as a tool for bringing about desired behaviours, but if HEFCE proceeds with this before allowing the publishing world to catch up, it runs the very real risk of skewing publishing practices in ways that damage the international standing of UK research, of excluding some of the very best research from the assessment process, or, more likely, both. I return to my opening point: successful institutions will seek to produce the very highest-quality research and ensure that it is recognised as such before acceding to any artificial restrictions imposed by the REF. That is where institutions – and specifically we as research managers – can most effectively play a part in working with HEFCE to ensure the integrity of the process.

The truth is, as much as we must work against it, the REF does affect behaviours – just as funders’ policies and priorities do. Professor Dame Judith Rees will be speaking at the 2014 ARMA Conference, reflecting on just how much ‘research incentives’ in all their forms have influenced research practices over recent history. If the REF has worked against interdisciplinarity in the past – and there are many reasons for believing it has – funders’ growing emphases on grand challenges, often large-scale, collaborative projects with real-world applications and innovations in mind, are working in the opposite direction.
The danger is that, without reform, the introduction of impact assessment in the REF will do the same, creating two different types of REFable research: that which leads to high-quality publications and that which leads to significant impact. Institutions and research managers will continue to need to steer a course between them.

All additional regulation or compliance measures, whatever they are, create the need for more research managers. We have seen this in the past with the introduction of TRAC/fEC, governance arrangements (the Department of Health’s Research Governance Framework as an earlier example; the Concordat more recently), the growing need for ‘advocacy’ in financially straitened times, pathways to impact, timesheets and European Commission bureaucracy, as well as more welcome complexities: a growing realisation that international collaborations sometimes need to be ‘facilitated’ (while recognising they cannot be ‘managed’), and a greater emphasis on training researchers and supporting researchers’ careers. All research offices will be able to track how they have grown in size over the last couple of decades; support for the administration of the REF was once a one-person job, but despite simplification measures this time around, it now takes a dedicated team.

The role of the research manager is changing, as more and more institutions invest in research facilitators to support the complex processes that research development often involves, to provide more pre-award support (with specialisms across the funding bodies) and improved post-award support (to meet increasingly stringent reporting requirements). The pressures keep piling up as new initiatives take hold. Our next priority will be open data, hot on the heels of open access. And we’ll keep a wary eye on the profile afforded citations and other metrics in the assessment of quality (and the concomitant need for more complex management tools and the rise of the scientometrician).

We will adapt if – as is the risk at the time of writing – the science ring-fence turns out not to be a ring-fence after all (what is a ring-fence if it can be withdrawn?).
All this will be worth it if it keeps the UK at the forefront of international research. Others have observed that a government’s controls increase in inverse proportion to the funding it provides. Other European governments are loosening their grip on universities and providing greater freedoms of the sort traditionally enjoyed in the UK and the US.

So long as stricter funder conditions and the growing compliance burden are aligned with research interests, the research manager will add value to the research process, and institutions will recognise the value of the regulations and the need for the research manager. For the moment, institutions will continue to focus on the highest-quality research above all else, in anticipation that funders and the REF exercise will recognise this. If that ever changes, it will be the sign to funders and policy-makers to turn back.

David Coombe is the director of the research division at the London School of Economics and Political Science, and director of conference planning for the Association of Research Managers and Administrators (ARMA) UK.
