Wednesday, August 26, 2015

The review was chaired by James Wilsdon, professor of science and democracy at the University of Sussex, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and research administration. Through 15 months of consultation and evidence-gathering, the review looked in detail at the potential uses and limitations of research metrics and indicators, exploring the use of metrics within institutions and across disciplines.

The main findings of the review include the following: 

  • There is considerable scepticism among researchers, universities, representative bodies and learned societies about the broader use of metrics in research assessment and management.
  • Peer review, despite its flaws, continues to command widespread support as the primary basis for evaluating research outputs, proposals and individuals. However, a significant minority of respondents were enthusiastic about greater use of metrics, provided appropriate care is taken.
  • Carefully selected indicators can complement decision-making, but a ‘variable geometry’ of expert judgement, quantitative indicators and qualitative measures that respects research diversity will be required.
  • There is legitimate concern that some indicators can be misused or ‘gamed’: journal impact factors, university rankings and citation counts being three prominent examples.
  • The data infrastructure that underpins the use of metrics and information about research remains fragmented, with insufficient interoperability between systems.
  • Analysis concluded that no metric can currently provide a like-for-like replacement for REF peer review.
  • In assessing research outputs in the REF, it is not currently feasible to rely on quantitative indicators alone.
  • In assessing impact in the REF, it is not currently feasible to use quantitative indicators in place of narrative case studies. However, there is scope to enhance the use of data in assessing research environments. 

Go to source: http://www.hefce.ac.uk/pubs/rereports/Year/2015/metrictide/