Browsing by Author "Miles, Rachel A."
- 2023 Spring Open Forum: Connecting the Opens: Open Access, Open Data, Open Educational Resources, and Establishing Your Online Scholarly Presence
  Walz, Anita R.; Young, Philip; Petters, Jonathan L.; Miles, Rachel A.; Surprenant, Aimée (Virginia Tech, 2023-02-20)
  Join the University Libraries for a presentation for future professors regarding open access, open educational resources, open data, and establishing an online scholarly presence. University Libraries faculty Philip Young, Anita Walz, Jonathan Petters, and Rachel Miles will provide a brief overview of each topic, with discussion to follow.
- Creation of an Incentivized Course for Managing Your Online Scholarly Identity
  Miles, Rachel A.; Mazure, Emily S. (2024-11)
  Librarians at a large research-intensive university in southwest Virginia in the United States developed an online asynchronous course on how researchers can manage their online scholarly identity. The course explains the importance of understanding and efficiently using scholarly identifiers and profile systems, and it guides participants through the process of creating and maintaining scholarly profiles and identifiers, with the goal of having participants complete specific activities. To encourage completion of those activities, the course was designed with specific incentives; for example, credits earned from the course can be used by faculty toward the university's computer-refresh program, which enables them to acquire a new computer after a four-year period. The content was developed in the institutional professional development Canvas platform and is thus available internally to faculty and graduate students. Participants can self-select which modules to complete. Additionally, participants can submit proof and receive credit for completing specific tasks such as registering for ORCID, linking IDs across profile systems (e.g., Scopus, Google Scholar Profile, Elements profile), and completing profile details such as education, academic positions, and scholarly works. This course is intended as a pilot that we expect to expand upon. Future goals for the course include covering two additional strategies for boosting online scholarly visibility: increasing the discoverability and openness of scholarship and promoting work through social media and other online channels.
- Faculty Perceptions of Research Assessment at Virginia Tech
  Miles, Rachel A.; Pannabecker, Virginia; Kuypers, Jim A. (Levy Library Press, 2020-07-07)
  In the spring of 2019, survey research was conducted at Virginia Polytechnic Institute and State University (Virginia Tech), a large, public, Carnegie-classified R1 institution in southwest Virginia, to determine faculty perceptions of research assessment as well as how and why they use researcher profiles and research impact indicators. The Faculty Senate Research Assessment Committee (FSRAC) reported the quantitative and qualitative results to the Virginia Tech Board of Visitors to demonstrate the need for systemic, political, and cultural change regarding how faculty are evaluated and rewarded at the university for their research and creative projects. The survey research and subsequent report started a gradual process to move the university to a more responsible, holistic, and inclusive research assessment environment. Key results from the survey, completed by close to 500 faculty from across the university, include: a) the most frequently used researcher profile systems and the primary ways they are used (e.g., profiles are used most frequently for showcasing work, with results indicating that faculty prefer to use a combination of systems for this purpose); b) the primary reasons faculty use certain research impact indicators (e.g., number of publications is frequently used but much more likely to be used for institutional reasons than personal or professional reasons); c) faculty feel that research assessment is most fair at the department level and least fair at the university level; and d) faculty do not feel positively towards their research being assessed for the allocation of university funding.
- Faculty Perceptions on Research Impact Metrics, Researcher Profile Systems, Fairness of Research Evaluation, and Time Allocations
  Miles, Rachel A.; Pannabecker, Virginia; MacDonald, Amanda B.; Kuypers, Jim A. (2019-10-09)
  This survey research study was conducted by the Faculty Senate Research Assessment Committee at Virginia Tech in the spring of 2019 to determine how and why faculty at Virginia Tech use researcher profiles and research impact metrics. The survey also assessed how faculty perceive research assessment at the department, college, and university levels, and asked their views on the potential integration of their research into a new incentive-based budget model at the university. The results of this study also help to inform institutional policy reform at Virginia Tech. The results and their implications for practice for researchers, librarians, and scientometricians were presented at the 6:AM (the sixth) Altmetrics Conference in Stirling, Scotland, United Kingdom.
- Follow the Yellow Brick Road? Overcoming Beliefs in Wizard-conjured Data & Metrics
  Miles, Rachel A. (2022-06-16)
  As much as we would all like to trust in bibliometrics and research analytics presented to us from visibly intelligent and sophisticated sources and tools, there are those of us expert enough to realize that there is always a wizard behind the curtain. Our awareness of the wizard does not always communicate well to those in Emerald City, though, and even when it does, they may acknowledge the wizard but find it difficult to find error in his ways. As bibliometricians and academic librarians, we have all faced a faculty member, researcher, or administrator who may trust blindly in a proprietary research analytics platform to determine their strengths and weaknesses as a university, college, or other unit. Though these analytic tools provide valuable insights to institutions, it is the over-reliance on these tools that we often struggle to overcome. In this narrative, we sometimes must work with those who are willing to be taken down that metaphorical yellow brick road to discover the wizard behind the curtain themselves. Other times, we find allies and advocates within the university for a more responsible and ethical approach to research metrics. Regardless, when we have patience, we can allow our discussions and their questions to lead us all to a more realistic picture of researcher output and impact. After Glinda the Good Witch told Dorothy that she only needed to tap her ruby heels together three times to get home, she also pointed out that she could not just tell her this at the beginning of her long journey: "Because she wouldn't have believed me. She had to learn it for herself." To be clear, the character Dorothy is not a direct or even fair comparison to research administrators, such as deans and vice presidents. Administrators, faculty members, and researchers are under a great deal of pressure to support the university community, achieve higher rankings, and meet other external evaluator measures. One way we respond to this need is by promoting ways to approach research evaluation and analytics in a wiser, more responsible manner while also assisting administrators and researchers in tracking, analyzing, and highlighting their academic achievements via a number of databases and tools. Furthermore, sometimes we, as librarians and bibliometricians, may find ourselves in the ruby shoes of Dorothy, learning our own lessons about research impact and evaluation from other perspectives. In our own struggles at Virginia Tech to overcome over-reliance on any sort of Wizard of Research Metrics or Data, we work closely with university administrators across various university levels (e.g., Office of the Vice President for Research and Innovation, Provost's Office, Analytics & Institutional Effectiveness) on analyzing colleges' and departments' research output via free and proprietary databases and research analytic tools, while also providing explanations of the limitations and caveats of such analytics. Over the past two years, our research impact team was invited to speak at three crucial meetings (among others) with administrators from two colleges and the Associate Deans for Research group. These meetings helped us communicate and clarify our research impact services and how we can more comprehensively track and analyze research impact while also calling attention to the limitations of bibliographic data sources.
  In addition, we work closely with the Elements Implementation Team, which consists of faculty and staff from the Provost's Office - Faculty Affairs and Institutional Analytics and Effectiveness, and the University Libraries; this team supports administrators and faculty with using the Symplectic Elements system (a Digital Science tool for tracking scholarly information and for faculty activity reporting). As a result of our efforts, collaborations, and presentations over the past two years, there has been an increase in colleges' implementation and use of the Elements system. We also began pilot projects with two colleges to more comprehensively track faculty scholarship by collecting, importing, and manually entering scholarly output data from their curricula vitae and scholarly profiles into the Elements system. Finally, we have collaborated with university governance, policy, and administrative partners (e.g., Faculty Senate, Provost's Office - Institutional Rankings, Faculty Affairs, Academic Resource Management, the Office of Research and Innovation) to solicit feedback and contribute to ongoing institutional discussions and practices related to research metrics, research assessment, data collection, data use, and methods used to support institutional progress towards world rankings and other external evaluator measures. These efforts have helped us become recognized as experts on research impact on campus, but more importantly, they have given us a voice in institutional discussions regarding metrics, impact, and incentives. This presentation will detail our struggle to overcome some of the misconceptions of research analytics, research impact, and research metrics; to convince administrators and faculty members of the importance of varied bibliographic data sources and data entry; and to influence the larger community through university governance efforts.
- How Do Academic Librarians Use Research Impact Metrics? Guest post by Rachel Miles
  Miles, Rachel A. (The Bibliomagician, 2019-05-08)
  I'm part of a research team that wanted to test whether claims about librarians' love for altmetrics were actually true. Along with Sarah Sutton (Emporia State University, Kansas, USA) and Stacy Konkiel (Digital Science, Minnesota, USA), I helped survey US librarians to determine the actual awareness and usage of altmetrics among academic librarians in the USA. We also surveyed librarians about their awareness and use of other types of research impact indicators like citation counts, the Journal Impact Factor, and qualitative impact evidence. Our study (published recently in the Journal of Librarianship and Scholarly Communication) was the first large-scale, national study of its kind. Librarians previously on the tenure track were much less likely to use altmetrics in their tenure and promotion dossiers than academic librarians currently on the tenure track. Some of the most interesting results from this study include:
  - Academic librarians with regular scholarly communication duties are likelier to use research impact indicators, compared with other academic librarians;
  - There's a growing interest among US academic librarians in using altmetrics as an indicator in promotion and tenure dossiers at institutions that offer tenure for librarians; and
  - Faculty tenure and promotion requirements tend to influence the likelihood of librarians addressing the JIF and citation counts during consultations.
- Metrics beyond Impact: New Approaches for the Novice Researcher
  MacDonald, Amanda B.; Miles, Rachel A. (2019-09-04)
- Navigating 21st Century Digital Scholarship: Open Educational Resources (OERs), Creative Commons, Copyright, and Library Vendor Licenses
  Seibert, Heather; Miles, Rachel A.; Geuther, Christina (Taylor & Francis (Routledge), 2018-06-09)
  Digital scholarship issues are increasingly prevalent in today's environment. We are faced with questions of how to protect our own works as well as others' with responsible attribution and usage, sometimes involving a formal agreement. These may come in the form of Creative Commons Licensing, provisions of the U.S. Copyright Act, or terms of use outlined by contractual agreements with library vendors. Librarians at East Carolina University and Kansas State University (K-State) are among several university libraries now providing services to assist with navigating these sometimes legalistic frameworks. East Carolina University Libraries are taking initiatives to familiarize faculty, researchers, and students with Open Educational Resources (OERs) and Creative Commons Licensing (CCL). At K-State, librarians in digital scholarship and electronic resources identified the overlap of their subject matters through their correspondence regarding users' copyright and licensing questions; a partnership formed, and they implemented a proactive and public-facing approach to better meet user needs and liability concerns at a major research university.
- Open Access Forum 2020: Connecting the Opens: Open Access, Open Educational Resources, and Open Data
  Briganti, Jonathan; DePauw, Karen P.; McNabb, Kayla B.; Miles, Rachel A.; Mueller, Derek; Roy, Siddhartha; Sridhar, Venkataramana (Virginia Tech. University Libraries, 2020-10-19)
  Join faculty presenters from around the university, University Libraries faculty, and the Preparing the Future Professoriate graduate class in a robust discussion about nuances, similarities, and differences in the "opens." This event begins with brief discussions of open access (OA), open educational resources (OER), and open data before situating this conversation within open access trends in the U.S., Europe, and at Virginia Tech. Presenters and panelists include Jonathan Briganti (University Libraries), Karen DePauw (Graduate School), Kayla McNabb (University Libraries), Rachel Miles (University Libraries), Derek Mueller (English), Siddhartha Roy (Civil and Environmental Engineering), and Venkat "Sri" Sridhar (Biological Systems Engineering).
- Open Education Forum 2021: Connecting the Opens: Open Access, Open Education & More
  McNabb, Kayla B.; Miles, Rachel A.; Wolfe, Mary Leigh; DePauw, Karen P.; Ogejo, Jactone Arogo; Tucker, Thomas J. (Virginia Tech, 2021-03-01)
  Join faculty presenters from around the university, University Libraries faculty, and the Future Professoriate graduate class in a robust discussion about nuances, similarities, and differences in the "opens." Learn about open access (OA) trends in the U.S., Europe, and at Virginia Tech. Learn about the differences between open access and open educational resources (OER). Presenters and panelists include Karen DePauw (Dean, Graduate School), Jactone Ogejo (Biological Systems Engineering), Mary Leigh Wolfe (Biological Systems Engineering), Thomas Tucker (School of Visual Arts), and Kayla McNabb and Rachel Miles (University Libraries). Slides from this presentation are available at http://bit.ly/pfpopened2021
- Research Assessment & Bibliodiversity: Curation of Open Educational ResourcesMiles, Rachel A. (Association of College and Research Libraries, 2023-03-16)As academic librarians and bibliometric practitioners are faced with increasing pressure to conduct advanced research analytics, provide research impact reports to administrators and research managers, and interpret complex bibliometric, scientometric, and altmetrics data, the need for education and professional development has sharply risen. As a result, a new collection of open educational resources is being curated to address this growing demand. This presentation will provide a high-level overview of the collection and its items, the organization of the collection and its diverse topics, and the collection's background, as well as a brief summary of the larger scholarly communication project under which it is nested. Throughout the presentation, the speaker will emphasize the importance of bibliodiversity and responsible research assessment, how the two are intertwined, and how they must be incorporated into all assessment practices to effectively advocate for inclusivity and valuation of diverse scholarship across disciplines, backgrounds, and underrepresented groups. Learning Outcomes: To summarize the importance of incorporating bibliodiversity and responsible research assessment in all research impact reports, analytics, and guidance To describe a curated collection of existing, centrally located open educational resources on research impact To implement the items in the collection into your own practices and interactions with administrators, research managers, and faculty members.
- Researcher profiles: Teaching students to cultivate a successful online presence
  MacDonald, Amanda B.; Miles, Rachel A. (2020-02)
  Students engaging in undergraduate research have the opportunity to learn from a mentor, often preparing them for future careers in industry or professional programs. While these students gain high-level, technical research skills, they often do not consider or create marketable deliverables that they can use in the future to influence their career trajectory and curate their online identity. In this session, presenters will work with attendees on pedagogical approaches for teaching undergraduate researchers, or others engaging in similar high-impact practices, to synthesize their personal, professional, and academic experiences in order to develop research profiles related to career goals and objectives.
- Responsible Research Evaluation 2024: Summary of the SCOPE Workshop
  Wolf, Baron G.; Miles, Rachel A. (NCURA, 2024-08)
  Over the past six years, the INORMS Research Evaluation Working Group has worked closely with groups across the academic sector to help them use the SCOPE Framework in their own research evaluation exercises. In 2023, the Institute of Museum and Library Services (IMLS) awarded a grant (grant number: LG-254850-OLS-23) to Dr. Baron Wolf, Assistant Vice President for Research and Director for Research Analytics at the University of Kentucky, and Co-PI Rachel Miles, Research Impact Coordinator at Virginia Tech University Libraries, to workshop the SCOPE Framework in the US at a two-day, in-person forum that brought together librarians, researchers, university administrators, and research managers and provided formal training in making strategic decisions using research evaluation methods. The forum took place in Albuquerque, New Mexico, March 13-14, 2024 (https://evaluationforum.uky.edu).
- 'Responsible use of what?': Navigating US university governance to approve an institutional statement on the responsible use of metrics
  Miles, Rachel A. (2023-06-08)
  Despite the first international initiative on responsible research assessment, the 2012 Declaration on Research Assessment (DORA), beginning in the US, only one entire US university and four individual units within universities have signed DORA. Furthermore, few US universities have their own institutional statements on the responsible use of research metrics. The lack of visible commitment to responsible research assessment in the US can be partly attributed to the decentralized governing and funding systems of US universities. Unlike the UK and other countries, each US university has its own governing and budget models, with a state university's funding at least partly reliant on the state in which it resides and on that state's politics. In this presentation, the speaker will talk about her experiences and lessons learned from navigating one US university's governance structure to form a faculty-led responsible research assessment task force, and then draft, communicate, and approve the university's first statement on the responsible use of research metrics.
- Rumor Has It: How Exploring Research Engagement through Metrics Transforms Student Learning
  MacDonald, Amanda B.; Miles, Rachel A. (2019-10)
  Increasingly, scholars are finding that their disciplines and sub-fields overlap and complement one another, leading to more cross-disciplinary and transdisciplinary collaborations and research projects. A Research Impact Librarian and an Undergraduate Research Services Librarian at a major research institution in the southeastern United States discovered that the overlap in their fields could enhance undergraduate researcher skills and expand the use and purpose of altmetrics. Traditional library instruction often focuses on digital and information literacy skills through keyword development, use of Boolean operators, database navigation, and proper citing of sources but rarely covers concepts related to citation metrics or altmetrics. Unconventional and innovative approaches to library instruction show students that research is not a profession; it is a life skill. Research can, of course, be a major part of someone's profession, but those who teach research literacy skills have the opportunity to imbue a sense of independence and competence in students unfamiliar with the scholarly conversation. Healthy skepticism, curiosity, exploration, vetting of sources, emotional self-awareness, and a general understanding of human behavior are lifelong research skills that are constantly being honed, reassessed, and developed. Bibliometrics and altmetrics can augment students' research skills by offering a window into the discussions surrounding research. Librarians can offer a more analytical and critical approach to their research instruction sessions by helping students interpret and decipher the meaning and context behind the metrics. While bibliometrics and altmetrics are traditionally used to assess individual researchers, research institutions, industries, scholarly journals, scholarly societies, and other groups of researchers, this interactive workshop will demonstrate how participants can use altmetrics to teach undergraduate students to engage in the scholarly conversation, develop topics, understand seminal works, evaluate sources, and investigate the motivations behind research metrics in both academic and public spheres.
- Scholarly Presenting & PublishingMiles, Rachel A. (2022-04-12)This is an overview of scholarly presentations and scholarly publishing, which includes preparing and designing presentation materials, tips for presenting, resources and tutorial links for learning more. The overview of scholarly publishing includes investigating the journal's scope, aim, peer review process, turnaround time, and metrics; topics covered also include publisher copyright policies, Open Access, institutional repositories, and scholarly profiles.
- Teaching Undergraduates to Collate and Evaluate News Sources with Altmetrics
  MacDonald, Amanda B.; Miles, Rachel A. (Association of College and Research Libraries, 2021-08)
  In the digital age of information, undergraduate students often have a difficult time identifying and differentiating among online sources, such as news articles, blog posts, and academic articles. Students generally find these sources online and often struggle to vet them for consistency, context, quality, and validity. In this chapter, we present a new purpose for altmetrics in which librarians teach undergraduates to use altmetrics as a tool to evaluate and differentiate between online mainstream and scholarly sources, which can lead to a deeper understanding of the research process and the engagement and discussion surrounding research as well as an increased ability to evaluate sources more critically. On a more advanced level, students will be able to analyze different levels of inaccuracy and misrepresentation of research from mainstream sources and more accurately identify highly sensationalized research topics from mainstream sources, seminal works of research, and deliberately misleading information and/or fake news. Slides for the learning activity are available at https://sandbox.acrl.org/library-collection/using-altmetrics-evaluate-pseudoscience-news-media
- Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice
  Miles, Rachel A.; Price, Robyn (2023-10-12)
  This guide focuses specifically on data from the data provider and company Altmetric, but other types of altmetrics, such as the Open Syllabus database for finding educational engagement with scholarly outputs, are mentioned and occasionally used as a comparison. The guide opens with an introduction, followed by an overview of Altmetric and the Altmetric Attention Score, Altmetrics and Responsible Research Assessment, Output Types Tracked by Altmetric, and the Altmetric Sources of Attention, which include: News and Mainstream Media; Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents; Peer Review; Syllabi (historical data only); Multimedia; Public Policy Documents; Wikipedia; Research Highlights; Reference Managers; and Blogs. Finally, there is a conclusion, a list of related resources and readings, two appendices, and references. This guide is intended for use by librarians, practitioners, funders, and other users of Altmetric data, as well as those interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can also help researchers who are going up for annual evaluations and promotion and tenure reviews and who can use the data in informed and practical applications. It can likewise be a useful reference guide for research managers and university administrators who want to understand the broader online engagement with research publications beyond traditional scholarly citations (bibliometrics), but who also want to avoid misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
- Using Altmetrics to Explore the Scholarly Conversation with Undergraduates
  Miles, Rachel A.; MacDonald, Amanda B. (2019-06-07)
  An emerging tool called altmetrics allows students to more fully engage with and evaluate the conversations surrounding scholarship. Altmetrics represents the online attention to research from sources such as news media outlets, social media, public policy documents, reference managers, blogs, Wikipedia, patents, syllabi, and more. Altmetrics has previously been used by researchers, industry, and research institutions to track the public engagement of scholarship, but this pioneering approach combines undergraduate library instruction with the scholarly conversation to demonstrate the expanding scope of altmetrics in the classroom. The speakers will show how altmetrics can be used to discover broader conversations around research, evaluate scholarly information, and analyze the short-term and long-term influence of scholarly works both in academic and public spheres.
- Virginia Tech Research Impact & Intelligence Team: An Integrative Approach to Research Analytics and Responsible Assessment Practices
  Miles, Rachel A.; Stovall, Connie; Over, Sarah (2022-09-22)
  The current landscape of academic incentives, rewards, and assessment practices is beginning to shift and evolve, and though change is slow, it is critical to a healthy academic ecosystem. At Virginia Tech, a large research institution and state university in southwest Virginia, United States, librarians Rachel Miles, Connie Stovall, and Sarah Over of the newly formed Research Impact & Intelligence (RII) Team are helping to ensure that the Virginia Tech research community can competently, ethically, and expertly assess research productivity and impact; benchmark across and against departments, colleges, and other universities; and assess individual faculty members fairly and responsibly. This poster presentation will outline the services, projects, and initiatives provided by the RII Team and how they connect to and complement one another. For example, Rachel has focused much of her work on responsible research assessment efforts within university governance structures and is currently chairing a Faculty Senate Task Force to write the first responsible research assessment statement of principles for the university. In addition, Connie and Rachel work together and individually on research impact reports and research analytics for groups across campus. They also consult with administrators from the Office of the Vice President for Research and Innovation, Faculty Affairs, and Academic Resource Management to make recommendations on decisions regarding the use of bibliographic data and bibliometrics. Together, these initiatives and services inform the Virginia Tech research community about its research activities and impacts while providing guidance on proposed policies, the incentive-based budget model, and nuanced bibliographic data interpretation.