This is a guest blog by Dr Rebecca Taylor-Grant, where she describes recent work on the TIER2 Editorial Reference Handbook. For more information, please see the publication:
Taylor-Grant, R., Cannon, M., Lister, A., & Sansone, S.-A. (2025). Making reproducibility a reality by 2035? Enabling publisher collaboration for enhanced data policy enforcement. International Journal of Digital Curation, 19(1). https://doi.org/10.2218/ijdc.v19i1.1064
Why is it so difficult to introduce new data policies into the publishing workflow? Why can’t something simple like a checklist improve adherence? Making research data FAIR (Findable, Accessible, Interoperable, and Reusable) is one of the best ways to improve the reproducibility of research. Many journals have data-sharing policies to support this, but how can we continue to support the implementation and enforcement of stringent policies?
In this article we describe a cross-publisher initiative undertaken as part of the TIER2 Project, which mapped the publishing workflow and identified the various individuals and teams who handle a manuscript from submission to publication. We then layered a set of rigorous data checks on top, making sure that every check sits at a logical point in the flow and is carried out by someone with the appropriate knowledge and skills to perform it (it wasn't as simple as it sounds!). We believe this approach could be useful for all kinds of projects and collaborations.
These checks became a core part of the Editorial Reference Handbook (https://publishers.fairassist.org/), which guides journal editorial teams in operationalising a set of checks that foster good data sharing practices in manuscripts.
Development
Academic journals have supported Open Science through the implementation of data sharing policies for over ten years; some evidence has since emerged on the additional time, resources and expertise that policy enforcement requires as part of an editorial workflow. As part of the co-creation process, publishers and academics examined practical and pragmatic ways to increase the FAIRness and reproducibility of published research through a series of collaborative workshops.
The workshops aimed to identify the key checks needed to enforce strengthened journal data sharing policies and to understand which editorial roles have the capacity to undertake those checks. The intended outcome of this work was to establish the workflows and resourcing that can support academic journals in enforcing stronger data sharing policies in future.
The resulting Editorial Reference Handbook has three main components:
- Guidance: A web-based manual outlining rationale, definitions, implementation tips, and consensus roles at each manuscript submission stage.
- Checklist: A structured spreadsheet containing 13 specific checks at both the manuscript and digital-object levels.
- Flowchart: A consensus workflow showing when to apply each check, the role identified as the best fit for each check, and what actions to take on pass/fail.
The Handbook fills a gap by providing unified, actionable guidance on publishing digital research objects (e.g., research data) as part of manuscript submission. Although it is presented with the consensus roles and workflow stages identified by publishers, it is flexible: it can be adapted to diverse editorial workflows and used by the variety of individuals and teams who handle a manuscript.
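To make the structure of the checklist and flowchart more concrete, here is a minimal, purely illustrative sketch of how a single check could be represented programmatically, combining the elements described above (check level, workflow stage, responsible role, and pass/fail actions). The field names, the example check, and the role shown are hypothetical and chosen for illustration only; they are not taken from the Handbook itself, which is a web-based guide, spreadsheet, and flowchart rather than code.

```python
# Illustrative sketch only: a hypothetical machine-readable form of one
# Handbook-style check. Field names and example values are assumptions,
# not the Handbook's actual contents.
from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    MANUSCRIPT = "manuscript"
    DIGITAL_OBJECT = "digital object"


@dataclass
class EditorialCheck:
    name: str               # short label for the check
    level: Level            # manuscript-level or digital-object-level check
    workflow_stage: str     # point in the submission workflow where it is applied
    responsible_role: str   # role identified as the best fit to run the check
    on_pass: str            # action to take when the check passes
    on_fail: str            # action to take when the check fails


# Hypothetical example: checking that a data availability statement is present.
example_check = EditorialCheck(
    name="Data availability statement present",
    level=Level.MANUSCRIPT,
    workflow_stage="Initial submission checks",
    responsible_role="Editorial assistant",
    on_pass="Proceed to the next stage of the workflow",
    on_fail="Return the manuscript to the authors with a request to add the statement",
)

print(f"{example_check.name}: handled by {example_check.responsible_role}")
```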
Intervention
The use of the Handbook is being piloted by a number of journals during Summer 2025, and surveys of ‘positive control’ journals and publishers have just been completed to supplement and add context to the intervention outcomes. The pilot aims to document what may need to change or improve for these checks to be implemented successfully, in terms of:
- in-house capability (e.g., needing more knowledge about how to run them),
- opportunity (e.g., needing support to apply them), and
- motivation (e.g., needing to prioritise them).
The Handbook can play a key role in enhancing future cross-stakeholder collaborations to advance Open Science and research integrity. The core set of checks it contains addresses a seemingly small gap but, applied consistently, could have a real impact on how publishers collaborate with other stakeholders to improve Open Science practices.