Usage and metrics
Abstract
These guidelines will help Diamond OA publishers and journals implement the Diamond Open Access Standard’s (DOAS) requirements regarding usage and metrics.
The implementation of comprehensive, accurate, and reliable usage and metric indicators, along with clear communication about analytical tools and methodologies, allows Diamond OA publishers and journals to maintain high standards of transparency and accountability. Compliance with data protection regulations further safeguards user privacy and enhances the credibility of the collected data and the resulting metrics.
Main Text
Assessing a journal's quality and the impact of its published content, particularly in the context of open access (OA) and open science, requires a varied approach. Traditional metrics remain relevant, but responsible metrics have emerged to capture the broader influence and reach of open access publications.
Responsible metrics highlight the importance of a balanced, transparent, and inclusive approach to evaluating research. By employing a diverse array of metrics, integrating qualitative assessments, and prioritising transparency and fairness, responsible metrics aim to more accurately reflect the multifaceted impact of scholarly work. By adopting this approach, Diamond OA journals and publishers support the broader goals of open science, promoting an environment where research can be freely shared, accessed, and valued based on an assessment of its actual impact.
To support diversity and inclusivity in measuring impact, it is recommended to avoid using the Journal Impact Factor (JIF) as the sole measure of a journal's quality and the impact of its published content. Instead, Diamond OA publishers and journals should use a variety of quantitative and qualitative indicators to capture a broader spectrum of research impact, including citation metrics, altmetrics, peer reviews, and societal impact measures. It is also necessary to recognise that different disciplines have different norms and impact measures.
It is recommended that the methodologies behind metrics be transparent and understandable so that researchers know how their work is evaluated. Selected metrics should use open datasets whenever possible to allow verification and further analysis by the academic community.
Quantitative metrics should be complemented with qualitative evidence, such as peer review comments, case studies, and expert opinions, to provide narrative context for the numbers. In this sense, open peer review, which provides free access to reviewers' reports, should be given more consideration and adopted whenever possible.
The metrics used should be regularly reviewed and updated to reflect changes in the research environment and technological advancements. These metrics can indicate impact (which can be based on citations or alternative metrics), usage, or efficiency of editorial workflow and decisions (like submission, acceptance and publication dates or rejection rates). The granularity of these metrics can also vary, as they may relate to specific content items like articles or the entire publication.
Citation Metrics
Citation metrics count the total number of times an article is cited in other scholarly works.
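Citation counts for individual articles can be retrieved programmatically from open sources. The following minimal sketch assumes the public Crossref REST API, whose is-referenced-by-count field reflects only citations indexed by Crossref; Scopus, WoSCC, or Google Scholar will report different figures for the same article.

    import requests  # third-party HTTP client (pip install requests)

    def crossref_citation_count(doi: str) -> int:
        """Return the number of citations Crossref has recorded for a DOI."""
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        resp.raise_for_status()
        # 'is-referenced-by-count' covers citations known to Crossref only;
        # other databases will report different counts for the same article.
        return resp.json()["message"]["is-referenced-by-count"]

    # Hypothetical usage:
    # crossref_citation_count("10.1234/example-doi")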
Altmetrics
Altmetrics include:
- tracking how often research is mentioned or shared on social networking platforms like X, LinkedIn, and Facebook
- monitoring appearances in blogs, news articles, and other media sources
- measuring how often articles are saved or bookmarked in reference managers like Mendeley
- tracking engagement through comments on publisher websites, discussion forums, and academic networks
Persistent identifiers (PIDs), such as DOIs, are crucial for calculating alternative metrics (altmetrics), which track online attention and engagement.
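As an illustration of how a PID anchors altmetric data, the sketch below queries Altmetric's free Details Page API by DOI. The endpoint and field names are assumptions based on Altmetric's public documentation and should be verified before use.

    import requests

    def altmetric_summary(doi: str):
        """Fetch basic online-attention data for a DOI from Altmetric's free API.

        Returns None when Altmetric has no attention data for the DOI (HTTP 404).
        """
        resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
        if resp.status_code == 404:
            return None  # no online attention recorded for this DOI
        resp.raise_for_status()
        data = resp.json()
        # Field names are assumptions; check the current Altmetric API documentation.
        return {
            "altmetric_score": data.get("score"),
            "total_mentions": data.get("cited_by_posts_count"),
            "mendeley_readers": data.get("readers", {}).get("mendeley"),
        }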
Usage Metrics
To comprehensively measure a journal's impact and that of its published content, especially in the context of open access and open science, it is essential to use a combination of traditional and alternative metrics (altmetrics). These measures should capture academic influence through citations and broader societal impact through downloads, mentions, and engagement on various platforms. Implementing these diverse metrics enables a holistic view of the journal's reach, influence, and contribution to the scientific community and society.
Diamond OA publishers should provide metric indicators at the article and journal level, including visits, views, downloads, and citations; altmetric data; and the geographical distribution of visitors. These metrics may require different levels of effort, technical expertise, and support, or they may involve financial resources. Some metrics are relatively undemanding from a technical perspective, such as recording submission, acceptance, and publication dates, while others may necessitate special software or plugins, like widgets displaying the geographical distribution of visitors. Finally, some metrics are proprietary solutions that may require funding, such as Altmetric, PlumX Metrics, or Dimensions citation badges.
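How such usage counts are produced can be sketched with a simplified log-aggregation routine. The toy example below loosely follows the COUNTER double-click rule (repeat requests by the same user within 30 seconds count once); production reporting should rely on a COUNTER-compliant implementation rather than this sketch.

    from collections import defaultdict

    DOUBLE_CLICK_WINDOW = 30  # seconds, per the COUNTER double-click filtering rule

    def count_usage(events):
        """Aggregate (timestamp, user_id, article_id, action) events into usage counts.

        Repeat requests by the same user for the same item within 30 seconds
        are collapsed into one count, loosely following COUNTER practice.
        """
        last_counted = {}           # (user, article, action) -> last counted timestamp
        totals = defaultdict(int)   # (article, action) -> usage count
        for ts, user, article, action in sorted(events):
            key = (user, article, action)
            if key in last_counted and ts - last_counted[key] < DOUBLE_CLICK_WINDOW:
                continue  # treat as a double click; do not count again
            last_counted[key] = ts
            totals[(article, action)] += 1
        return dict(totals)

    # Two rapid views by the same user count once:
    events = [(0, "u1", "a1", "view"), (10, "u1", "a1", "view"), (95, "u1", "a1", "download")]
    # count_usage(events) -> {("a1", "view"): 1, ("a1", "download"): 1}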
Editorial workflow
The editorial workflow is fundamental to maintaining the quality, timeliness, and integrity of scholarly publishing. It benefits all stakeholders involved (authors, reviewers, editors, and readers) by ensuring that research reports are disseminated smoothly, effectively, and ethically, maintaining high standards of scholarly rigour.
There are several components of the editorial workflow that can be monitored and measured:
- timeliness of the initial review of manuscript submissions (checks for completeness and adherence to submission guidelines, ensuring that the manuscript fits the journal scope and meets basic quality standards)
- assignment to an editor or associate editor with relevant expertise
- selection of appropriate reviewers and management of the peer review process
- supervision of the revisions and resubmission of the revised manuscript
- the decision on final acceptance
- copyediting, formatting and proofing
- typesetting and layout
- assignment of persistent identifiers (DOI etc.)
- publication of articles
- distribution and indexing in relevant databases
- promotion and dissemination of the published content
- monitoring of impact metrics
The efficiency of the editorial workflow can be measured by the turnaround time from submission to decision, reviewer response time and quality of reviews, consistency and fairness of editorial decisions, the number of appeals and their outcomes, production speed, and author and reviewer satisfaction. To improve the editorial workflow, the time between stages of the editorial process should be continuously tracked and evaluated, the quality of peer review and author revisions should be assessed, and author and reviewer satisfaction should be surveyed. It is important to identify stages in the editorial workflow where delays frequently occur and implement strategies for improvement. Continuous evaluation of the editorial management system and other technologies (plagiarism checks, automatic reminders, etc.) should be in place.
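As a minimal sketch of such measurement, median turnaround times can be derived from recorded submission, decision, and publication dates. The record structure below is hypothetical; real journals would export these dates from their editorial management system.

    from datetime import date
    from statistics import median

    def turnaround_days(records):
        """Median days from submission to first decision and to publication."""
        to_decision = [(r["decision"] - r["submitted"]).days for r in records]
        to_publication = [(r["published"] - r["submitted"]).days for r in records]
        return {
            "median_days_to_decision": median(to_decision),
            "median_days_to_publication": median(to_publication),
        }

    # Hypothetical records:
    records = [
        {"submitted": date(2024, 1, 10), "decision": date(2024, 3, 1), "published": date(2024, 5, 20)},
        {"submitted": date(2024, 2, 5), "decision": date(2024, 4, 15), "published": date(2024, 7, 1)},
    ]
    # turnaround_days(records)
    # -> {'median_days_to_decision': 60.5, 'median_days_to_publication': 139.0}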
Analytical Tools
An analytical tool combines the measurement, acquisition, analysis, and reporting of data to estimate journal usage, impact, and metrics. By using analytical tools, journals can gain a comprehensive understanding of their usage, impact, and influence within the academic community and beyond. Besides reflecting the quality of published content and the editorial workflow, the information provided by analytical tools is crucial for improving journal policies, enhancing visibility, and demonstrating value to authors, readers, and funders.
The most widely used analytical tools are based on popular citation databases, like the Web of Science Core Collection (WoSCC) and Scopus, and provide various metric indicators. Elsevier's analytical tool SciVal provides visualisation of research performance and benchmarking relative to peers, and allows for trend analysis. The visibility and impact of journals in different disciplines can also be assessed with the freely accessible, but not always reliable, Google Scholar (GS) citation metrics.
Other analytical tools, like Altmetric, go beyond citations and track online attention and discussions around scholarly articles, including mentions on social media, in news outlets, policy documents, and blogs. Besides citations and mentions/reactions on social media, Plum Analytics offers a range of metrics on usage and captures (exports/saves, readers, bookmarks, favourites). More recently, Dimensions has offered a research insights platform that integrates different data sources to provide metrics on publications, citations, grants, patents, clinical trials, and policy documents.
Although many popular analytical tools are available only at high subscription prices, there are also free analytical tools: CORE, which aggregates open access research outputs and provides metrics on their usage and impact; VOSviewer and CiteSpace for bibliometric analysis; the Open Access Button (which also offers tools for libraries); and various tools for usage data on institutional repositories. Additionally, commercial analytical tools sometimes offer a limited free version, freely accessible browser extensions (Altmetric, Unpaywall), or free access to a subset of features and data (Dimensions). There are also publisher-oriented tools, like Crossref Metadata Search, which provides citation links and some basic metrics.
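As an example of such a free tool in practice, the sketch below assumes Unpaywall's public REST API, which asks callers to identify themselves with an email address; the response field names are taken from its public documentation and should be verified against the current version.

    import requests

    # Unpaywall's free REST API asks callers to identify themselves by email.
    UNPAYWALL_EMAIL = "you@example.org"  # placeholder; replace with a real contact address

    def oa_status(doi: str) -> dict:
        """Check a DOI's open access status via the free Unpaywall API."""
        resp = requests.get(
            f"https://api.unpaywall.org/v2/{doi}",
            params={"email": UNPAYWALL_EMAIL},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        # 'is_oa' and 'oa_status' (gold, green, hybrid, bronze, closed) are
        # documented Unpaywall response fields; verify against current docs.
        return {"is_oa": data.get("is_oa"), "oa_status": data.get("oa_status")}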
Diamond OA publishers should ensure transparency about the analytical tools and methodologies used to collect, generate, and analyse data, providing descriptions of the tools, algorithms, and methodologies used. With respect to GDPR compliance, they must ensure that all data collection and processing activities comply with relevant data protection regulations, such as the General Data Protection Regulation (GDPR) in Europe. It is necessary to implement measures to protect the privacy of users whose data is being collected. This includes anonymising data where applicable and ensuring secure data storage.
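A common anonymisation measure in web analytics is truncating IP addresses before storage. The sketch below uses Python's standard ipaddress module; whether truncation alone is sufficient under the GDPR depends on the full processing context and should be confirmed with a data protection specialist.

    import ipaddress

    def anonymise_ip(raw_ip: str) -> str:
        """Truncate an IP address before storage to reduce identifiability.

        Zeroes the last octet of IPv4 addresses and the host part of IPv6
        addresses; truncation alone is not automatically GDPR-compliant.
        """
        ip = ipaddress.ip_address(raw_ip)
        prefix = 24 if ip.version == 4 else 48
        network = ipaddress.ip_network(f"{raw_ip}/{prefix}", strict=False)
        return str(network.network_address)

    # anonymise_ip("203.0.113.42") -> "203.0.113.0"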
Related toolsuite articles
Related guidelines
Related training materials
References
- Altmetric. https://www.altmetric.com/solutions/free-tools/
- CiteSpace. https://citespace.podia.com/
- COnnecting REpositories (CORE). https://core.ac.uk/
- Crossref Metadata Search. https://search.crossref.org/
- DOI Foundation. Digital Object Identifier. https://www.doi.org/
- European Commission. General Data Protection Regulation. https://gdpr.eu/what-is-gdpr/
- Google Scholar. https://scholar.google.com/
- Open Access Button. https://openaccessbutton.org/
- SciVal. https://www.scival.com/landing
- Unpaywall. https://unpaywall.org/
- VOSviewer. https://www.vosviewer.com/
- Web of Science Core Collection. https://clarivate.com/products/scientific-and-academic-research/research-discovery-and-workflow-solutions/webofscience-platform/web-of-science-core-collection/
Licensing
This article is made available under a Creative Commons Attribution 4.0 International License