
Follow the Yellow Brick Road? Overcoming Beliefs in Wizard-conjured Data & Metrics

Date

2022-06-16

Abstract

As much as we would all like to trust the bibliometrics and research analytics presented to us by visibly intelligent and sophisticated sources and tools, those of us with sufficient expertise realize that there is always a wizard behind the curtain. Our awareness of the wizard does not always communicate well to those in the Emerald City, though, and even when it does, they may acknowledge the wizard yet find it difficult to see error in his ways. As bibliometricians and academic librarians, we have all faced faculty members, researchers, and administrators who trust blindly in a proprietary research analytics platform to determine their strengths and weaknesses as a university, college, or other unit. Though these analytics tools provide valuable insights to institutions, it is over-reliance on them that we often struggle to overcome. In this narrative, we sometimes must work with those who are willing to be taken down the metaphorical yellow brick road to discover the wizard behind the curtain for themselves. Other times, we find allies and advocates within the university for a more responsible and ethical approach to research metrics. Regardless, when we have patience, we can allow our discussions and their questions to lead us all to a more realistic picture of researcher output and impact.

After Glinda the Good Witch told Dorothy that she only needed to tap the heels of her ruby slippers together three times to get home, she also pointed out that she could not simply have told her this at the beginning of her long journey: “Because she wouldn't have believed me. She had to learn it for herself.” To be clear, the character Dorothy is not a direct or even fair comparison to research administrators such as deans and vice presidents. Administrators, faculty members, and researchers are under a great deal of pressure to support the university community, achieve higher rankings, and meet other external evaluator measures. One way we respond to this need is by promoting wiser, more responsible approaches to research evaluation and analytics while also assisting administrators and researchers in tracking, analyzing, and highlighting their academic achievements via a number of databases and tools. Furthermore, sometimes we, as librarians and bibliometricians, find ourselves in Dorothy's ruby slippers, learning our own lessons about research impact and evaluation from other perspectives.

In our own struggles at Virginia Tech to overcome over-reliance on any sort of Wizard of Research Metrics or Data, we work closely with university administrators at multiple levels of the university (e.g., the Office of the Vice President for Research and Innovation, the Provost’s Office, and Analytics & Institutional Effectiveness) to analyze colleges’ and departments’ research output via free and proprietary databases and research analytics tools while also explaining the limitations and caveats of such analytics. Over the past two years, our research impact team has been invited to speak at three crucial meetings (among others) with administrators from two colleges and with the Associate Deans for Research group. These meetings helped us communicate and clarify our research impact services and how we can more comprehensively track and analyze research impact while also calling attention to the limitations of bibliographic data sources.
In addition, we work closely with the Elements Implementation Team, which consists of faculty and staff from the Provost’s Office - Faculty Affairs, Institutional Analytics and Effectiveness, and the University Libraries; this team supports administrators and faculty in using the Symplectic Elements system (a Digital Science tool for tracking scholarly information and for faculty activity reporting). As a result of our efforts, collaborations, and presentations over the past two years, colleges’ implementation and use of the Elements system have increased. We have also begun pilot projects with two colleges to more comprehensively track faculty scholarship by collecting, importing, and manually entering scholarly output data from their curricula vitae and scholarly profiles into the Elements system. Finally, we have collaborated with university governance, policy, and administrative partners (e.g., the Faculty Senate, the Provost’s Office - Institutional Rankings, Faculty Affairs, Academic Resource Management, and the Office of Research and Innovation) to solicit feedback and contribute to ongoing institutional discussions and practices related to research metrics, research assessment, data collection, data use, and the methods used to support institutional progress toward world rankings and other external evaluator measures. These efforts have helped us become recognized as research impact experts on campus and, more importantly, have given us a voice in institutional discussions about metrics, impact, and incentives. This presentation will detail our struggle to overcome some of the misconceptions about research analytics, research impact, and research metrics; to convince administrators and faculty members of the importance of varied bibliographic data sources and data entry; and to influence the larger community through university governance efforts.
