Harmonising Research Metrics and Indicators Across the Ecosystem
Highlights of ICOR Public Meeting #7, 13 November 2024
Please see the streaming video and chat record for rich details from speakers and meeting attendees.
Introduction
This ICOR public meeting aimed to explore effective strategies for communicating research impact across various contexts. As a recent blog post on the evolving landscape of research assessment emphasizes, “responsible research assessment and open scholarship are intrinsically linked.” This close relationship highlights the need for compatible and complementary indicators to provide meaningful evidence for hiring, promotion, and funding decisions.
As we transition towards a broader assessment of impact, promote open scholarship, and uphold responsible research practices, it’s crucial for the community to collaborate on a balanced and holistic approach to ensure fair and accurate research assessment.
To initiate the discussion, Anna Hatch, Program Officer, Open Science Strategy at the Howard Hughes Medical Institute (HHMI), introduced the topic and our three expert speakers whose presentations demonstrated how, as a community, we can learn from one another when developing and using new indicators and metrics for research assessment.
Research metrics – shared goals for all
Kelly Cobey, University of Ottawa Heart Institute and Co-Chair of DORA [slides; streaming video 5-32 mins]
Kelly kicked things off by highlighting the shared ‘problem’: the current system of measuring research. She pointed out how traditional metrics, such as publication counts or journal impact factors, often prioritize quantity over quality. Kelly explained how this ultimately stifles innovation, discourages collaboration, and makes it harder for researchers to focus on truly impactful work.
Kelly spoke about how DORA is leading the charge to change this by promoting more practical and fair ways to assess research. They’re working to ensure that the system values a wider range of contributions, including not only open science but also patient engagement and equity, diversity, and inclusion. Kelly introduced DORA’s practical guidance on using quantitative indicators, which highlights the inherent limitations of quantitative metrics and addresses concerns related to impact factors, the h-index, and altmetrics by emphasising the importance of transparency, the potential biases of such indicators, and the need to consider the reductive nature of quantitative measures.
Kelly then discussed Project TARA (Tools to Advance Research Assessment), which facilitates the development of new policies and practices for academic career assessment and provides an implementation guide to support academic institutions in research assessment reform. Kelly finished by stating that DORA is focused on collaborating with international groups to foster cooperation and knowledge exchange around new and existing metrics, and will continue to work with the global community on improving research assessment.
The emergence of structured CV narratives
Karen Stroobants, CultureBase Consulting [slides; streaming video 32-54 mins]
Karen explained her role as vice chair of COARA, the organisation’s history, and the commitments that have been established to provide a common direction for research assessment reform. Karen then turned her focus to the second commitment, ‘Base research assessment primarily on qualitative evaluation for which peer-review is central, supported by responsible use of quantitative indicators’, and explained COARA’s vision of the appropriate place for quantitative indicators across various levels of assessment.
This led into a discussion of how the emergence of structured CV narratives shifts the focus from quantitative metrics to qualitative impact, allowing researchers to tell their stories in a more engaging and meaningful way. She explained how the narrative format emphasizes the impact of research, accommodates diverse contributions, and can reduce bias in hiring and promotion decisions. She then highlighted how value-driven implementations of narrative CVs have been introduced by (mainly European) funders, and how institutions are also slowly adopting structured CV narratives to help promote a more equitable and inclusive research culture.
Karen concluded by sharing her personal ‘worry’: a future where research assessment becomes so focused on numbers and metrics that we lose sight of the bigger picture. She remains optimistic, however, that future research assessment will encourage researchers to reflect more deeply on their work and that a more holistic approach to assessment will lead to positive changes in research practices.
Responsible data metrics as an opportunity to redefine evaluation metrics
Iratxe Puebla, Make Data Count [slides; streaming video 54-72 mins]
Iratxe started by introducing Make Data Count, a community effort to develop open data metrics that assess data usage. She highlighted the unique nature of data, explaining that, unlike traditional outputs such as journal articles, data is used in a variety of ways that aren’t always captured by traditional metrics. Through funder and institution use cases, she emphasized that measuring data requires a balanced approach, combining quantitative and qualitative metrics with careful consideration of what is being evaluated and where human judgment is essential.
Iratxe went on to discuss the importance of context when using data metrics. She stressed the need for an iterative approach, recognising that the presence of certain characteristics may or may not indicate data quality, and that distinguishing between use and reuse is also crucial, as the ways researchers and others engage with data continue to evolve.
Iratxe’s perspective on data metrics offered valuable insights that can be applied beyond the realm of data. A balanced and thoughtful approach to metrics and indicators can be beneficial in various aspects of research assessment.
Summary
The meeting highlighted a shared interest and goal within the research ecosystem: the development and use of effective metrics and indicators. As new metrics are developed to measure open scholarship practices, it’s crucial to maintain a balance between quantitative and qualitative approaches. By considering context and avoiding an overreliance on ‘the numbers’, we can ensure that the spectrum of metrics and indicators truly reflects the impact of research.
Please see the video recording for access to the full meeting.
ICOR Community members: please suggest topics or volunteer to host future public meetings on this Google Form.