# Overview / Philosophy
Climate informatics, like many other communities and fields, has software at its heart. Underlying most publications is a novel piece of software playing some critical role, e.g., embodying a model, processing or analysing data, or producing a visualisation. For such software artefacts to have the most impact, they should be available, functional, and reproducible, so that other researchers can benefit from the work, verify the claims of the paper, and build upon it. These ideals are summarised by the FAIR principles for data, which apply equally to software: research software should be Findable, Accessible, Interoperable, and Reusable. To help promote [FAIR software](https://www.nature.com/articles/s41597-022-01710-x), Climate Informatics is embarking, for the first time, on an Artefact Evaluation phase following the standard peer-review process. Artefact Evaluation provides an opportunity to embed the values of reproducibility into the publication process in a lightweight, opt-in fashion, encouraging authors to make their software available and the results of their papers reproducible.
A committee of reviewers, the Artefact Evaluation Committee (AEC), will review the submitted artefacts against three criteria: Is the software available? Is it functional? Can it be used to reproduce the central claims or thesis of the paper?
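As a concrete illustration of what reproducibility can mean in practice, the sketch below shows one common convention: recording the software environment and code version alongside an experiment's outputs, so that reviewers can see exactly which software produced the results. This is an illustrative Python example, not a requirement of the evaluation process; the filename `environment.json` and the use of git are assumptions made for the sake of the sketch.

```python
import json
import platform
import subprocess
import sys
from importlib import metadata


def environment_snapshot() -> dict:
    """Collect the Python version, platform, git commit, and installed packages."""
    try:
        commit = subprocess.check_output(
            ["git", "rev-parse", "HEAD"], text=True
        ).strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        commit = "unknown"  # not a git checkout, or git not installed
    return {
        "python": sys.version,
        "platform": platform.platform(),
        "git_commit": commit,
        "packages": {
            dist.metadata["Name"]: dist.version
            for dist in metadata.distributions()
        },
    }


if __name__ == "__main__":
    # Save the snapshot next to the experiment's outputs so that an exact
    # software environment can be tied to each published result.
    with open("environment.json", "w") as f:
        json.dump(environment_snapshot(), f, indent=2)
```

Artefacts that record their dependencies this explicitly tend to be much easier for reviewers to re-run, whatever tooling the authors actually choose.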
# Timeline and Process
Authors of full papers accepted to the CUP journal proceedings will be encouraged (but not required) to submit supporting materials for Artefact Evaluation following the Climate Informatics conference, with a deadline of XXXX.
The Artefact Evaluation Committee will review the artefacts, assessing whether they are functional (i.e., can be run) and whether the central results of the paper can be reproduced. Reviewers will interact with the authors to suggest improvements to reproducibility wherever issues are discovered.
For each artefact, authors and reviewers will be asked whether they would like their interactions during artefact review to be:
* **closed**: carried out through a closed interface, with reviewers remaining anonymous;
* **open**: carried out through GitHub/GitLab issues, with reviewers' identities known.
If both parties agree to **open**, the interaction can take place in the open, e.g. with reviewers raising tickets/issues when they encounter problems, or even contributing a pull request if there is a fix. If either party chooses **closed**, then interactions between reviewers and authors will take place via an anonymous, closed interface, but interaction is still encouraged. Either way, artefact review should be seen as a collaborative activity that benefits both the authors and the wider community.
Reviewing will take place over 6 weeks.
# Evaluation Criteria and Outcomes
[Dominic/Marion/Roly to provide checklists for each of these]
## Available
[TODO]
## Functional
[TODO]
## Reproducible
[TODO]
## Addendum
Following artefact review, an addendum will be published alongside the paper, recording which of the above criteria were met during artefact review.
## Versioning
[DOIs, specific versions]
# Guidelines for Reviewers
Reviewers should encourage authors to push improvements into the artefact itself wherever possible. For example, if essential documentation is missing, then rather than simply supplying the reviewers with the required information, the authors should provide it in an updated version of the artefact.
## Expected workload
- 16 submissions, with 2 reviewers per submission and 2 submissions per reviewer (16 × 2 ÷ 2 = an AEC of up to 16 reviewers)
- A 2-hour onboarding session to get everyone up to speed
# Support
## What authors should provide in their artefact submission
## At CI2024
There will be a one-hour session for authors who are considering submitting artefacts, giving an overview of the approach, lightning talks on producing reproducible artefacts, a discussion of the criteria, and an opportunity to ask questions about the process.
## During the review process
- ReproHack checklist/feedback?
# Submission Requirements
# Benefits
# Review Committee
## Infrastructure
- Laptops? A cloud-ready environment? HPC?
## Roles
- Artefact Evaluation Committee
- Authors
- Reviewers
- CUP
## Retrospective report