`get-involved.md` (1 addition, 1 deletion)
@@ -41,4 +41,4 @@ We would be happy to list your journal or conference on our website after we hav
## As a developer or science communicator

CODECHECK is a community effort, and your help is welcome across all tasks that the team faces.
- If you want to write code to assist codecheckers to conduct more effective reviews or to streamline the CODECHECK review processes, or if you want to help present and share CODECHECK's vision and educate others on code executability checks - then _please get in touch with the [CODECHECK team](partners/#team) (see email links in footer)_.
+ If you want to write code to assist codecheckers to conduct more effective reviews or to streamline the CODECHECK review processes, or if you want to help present and share CODECHECK's vision and educate others on code executability checks - then _please get in touch with the [CODECHECK team](/partners/#team) (see email links in footer)_.
`guide/bundle.md` (1 addition, 1 deletion)
@@ -31,4 +31,4 @@ R Markdown has some nice features that are helpful for writing a report.
- Literate programming and code chunks (hidden, visible) with nice looking and mostly hassle-free PDF output (using [tinytex](https://yihui.name/tinytex/))
- You can configure document metadata, e.g. the title or subtitle, anywhere in the document, so you can choose to configure them only in the `codecheck.yml`, see [this example](https://github.com/codecheckers/Piccolo-2020/blob/master/codecheck/piccolo2020-codecheck.Rmd)
- The CODECHECK [assistant](https://github.com/codecheckers/assistant/) is an R package that streamlines report writing with R
- - You can capture the metadata of the computing environment in an automatically generated _colophon_ that lists installed packages and session information (using `sessionInfo()` or `devtools::session_info()`), see for example [this one](https://github.com/benmarwick/rrtools/blob/master/inst/templates/paper.qmd#L119) in the template in `rrtools` by Ben Marwick, or a _reproducibility receipt_, see [this code example](https://github.com/PredictiveEcology/pemisc/blob/cf1516ff3893a7ffbfe1ae6623c0350c47c3e1b2/R/reproducibilityReceipt.R#L20), which adds git repository information and information about external libraries to the session information
+ - You can capture the metadata of the computing environment in an automatically generated _colophon_ that lists installed packages and session information (using `sessionInfo()` or `devtools::session_info()`), see for example [this one](https://github.com/benmarwick/rrtools/blob/master/inst/templates/paper.qmd) in the template from `rrtools` by Ben Marwick, or a _reproducibility receipt_, see [this code example](https://github.com/PredictiveEcology/pemisc/blob/cf1516ff3893a7ffbfe1ae6623c0350c47c3e1b2/R/reproducibilityReceipt.R), which adds git repository information and information about external libraries to the session information
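As a rough illustration of the colophon idea in the changed bullet above (a sketch, not taken from the linked `rrtools` or `pemisc` code), a final R Markdown chunk along these lines would print a basic colophon; only `sessionInfo()` and `devtools::session_info()` come from the guide text, the rest is assumed:

```r
# Minimal colophon sketch for the last code chunk of an R Markdown report.
# The structure is illustrative; only sessionInfo()/devtools::session_info()
# are taken from the guide text above.
cat("Colophon\n")
cat("Report generated on:", format(Sys.time(), "%Y-%m-%d %H:%M %Z"), "\n")

# Installed/loaded packages and R version of the computing environment
sessionInfo()

# Richer alternative if the devtools package is available:
# devtools::session_info()
```

In a real report such a chunk would sit at the very end, so the session information reflects everything loaded while rendering the document.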
`guide/community-workflow.md` (3 additions, 3 deletions)
@@ -35,7 +35,7 @@ It is worth taking a look around for such checklists for your discipline or meth
From our experience, **documentation** is the key.
A typical measure for a good level of documentation is to provide at least as much information as the authors themselves would need after a longer period of time, e.g., 1 year, to run the analysis again.
Any researcher, even if not familiar with the software stack, should be able to run the workflow and find out if the code works.
- Structured information about the computing environment, such as a [_colophon_](https://github.com/benmarwick/rrtools/blob/master/inst/templates/paper.Rmd#L105) or _"reproducibility receipt"_ in computational notebooks (see [this discussion on Twitter](https://twitter.com/MilesMcBain/status/1263272935197782016?s=09)) are very helpful.
+ Structured information about the computing environment, such as a _colophon_ or _reproducibility receipt_, is very helpful; see the [CODECHECK bundle guide](/guide/bundle).

Common sense shall be applied to decide about the suitable amount of data and to handle big datasets, sensitive datasets with privacy concerns, and long execution times.
For example, data may be deposited depending on community practices in remote repositories, synthetic data may be used, subsets or preprocessed data may be included, or protected access to information may be provided (e.g. cloud-based data enclaves).
@@ -83,7 +83,7 @@ When your workflow is ready to be CODECHECKed, open an issue on the [CODECHECK reg
After the publication of the CODECHECK certificate, add a reference to the certificate in your paper, e.g., in a section describing your workflow or in the acknowledgements:

- > _A CODECHECK certificate is available confirming that [all of the, a (significant) part of the, the] computations underlying this article could be independently executed: <https://doi.org/certificate-DOI>._
+ > _A CODECHECK certificate is available confirming that [all of the, a (significant) part of the, the] computations underlying this article could be independently executed: `DOI of the certificate here`._

------
@@ -194,7 +194,7 @@ When a new issue is assigned to a codecheck editor in the register, here are a f
- Ask the codechecker to add/update all required metadata in the `codecheck.yml` and update the certificate report (especially the final DOI!), double-check the information in the metadata and the actual certificate; wait until the certificate is published with its own DOI.
- Trigger a rebuild of the register by adding the CODECHECK to the `register.csv` file; you may add a `closes #N` statement in the commit message to close the issue.
- Clear up the labels of the register issue - all labels except the [`community`](https://github.com/codecheckers/register/labels/community)/[`journal`](https://github.com/codecheckers/register/labels/journal)/[`conference/workshop`](https://github.com/codecheckers/register/labels/conference%2Fworkshop) should be removed.
- - "Archive" the repository clone in the codecheckers organisation on GitHub/the cdchk organisation on GitLab ([instructions for GitHub](https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/about-archiving-repositories), [instructions for GitLab](https://docs.gitlab.com/ee/user/project/settings/#archiving-a-project))
+ - "Archive" the repository clone in the codecheckers organisation on GitHub/the cdchk organisation on GitLab ([instructions for GitHub](https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/about-archiving-repositories), [instructions for GitLab](https://docs.gitlab.com/ee/user/project/working_with_projects.html#archive-a-project))
<p class="text-secondary text-justify"><em>Independent execution of computations underlying research articles.</em></p>
</div>
@@ -113,7 +113,7 @@ Learn more about the Reproducible AGILE initiative at <https://reproducible-agil
The CODECHECK team is grateful for the continued interest from the research community in the topic of evaluating code and workflows as part of scholarly communication and peer review.

Stephen gave a talk at the [**2022 Toronto Workshop on Reproducibility**](https://canssiontario.utoronto.ca/event/toronto-workshop-on-reproducibility/) organised by [Rohan Alexander](https://rohanalexander.com/).
- You can find the [slides online](https://sje30.github.io/talks/2022/codecheck22.html#1) and also watch the [**recording on YouTube**](https://www.youtube.com/watch?v=TgDgcqtsFvE) - well worth a look because of the great Q&A at the end!
+ You can find the [slides online](https://sje30.github.io/talks/2022/codecheck22.html) and also watch the [**recording on YouTube**](https://www.youtube.com/watch?v=TgDgcqtsFvE) - well worth a look because of the great Q&A at the end!

Stephen presented **CODECHECK: An Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility** ([slides](https://bit.ly/codecheck21)) in May 2021 at the [Reproducibility Tea Southampton](https://reproducibilitea.org/journal-clubs/#Southampton).
@@ -157,11 +157,13 @@ A [Nature News article](https://doi.org/10.1038/d41586-020-01685-y) by [Dalmeet
### 2019-11 | MUNIN conference presentation

- Stephen Eglen presented CODECHECK at [The 14th Munin Conference on Scholarly Publishing 2019](https://site.uit.no/muninconf/).
+ Stephen Eglen presented CODECHECK at [The 14th Munin Conference on Scholarly Publishing 2019](https://site.uit.no/muninconf/) with the submission "CODECHECK: An open-science initiative to facilitate sharing of computer programs and results presented in scientific publications", see <https://doi.org/10.7557/5.4910>.

- > Take a look at the [poster](https://septentrio.uit.no/index.php/SCS/article/view/4910/4893) and the [slides](https://septentrio.uit.no/index.php/SCS/article/view/4910/4900), or watch the [video recording](https://mediasite.uit.no/Mediasite/Play/8027873496dc465ebc4b9b3ab0338ad01d?playFrom=1772000).
- >
- > [](https://mediasite.uit.no/Mediasite/Play/8027873496dc465ebc4b9b3ab0338ad01d?playFrom=1772000)
+ Take a look at the [poster](https://septentrio.uit.no/index.php/SCS/article/view/4910/4893) and the [slides](https://septentrio.uit.no/index.php/SCS/article/view/4910/4900).
+ <!-- or watch the [video recording](https://mediasite.uit.no/Mediasite/Play/8027873496dc465ebc4b9b3ab0338ad01d?playFrom=1772000). -->
On the 30th of May 2024, the CODECHECK-NL team organised its first roadshow event in Delft. The event marks the beginning of a series of four workshops to be conducted across the Netherlands, where we carry out live codechecks with authors and reviewers, while also training a new batch of codecheckers across universities in the Netherlands.
- The Delft event kicked off in the morning with 20 participants and members of the CODECHECK-NL team. We began with a short introduction to the project and its scope by Frank Ostermann (PI for the project, based at the University of Twente), followed by an introduction to codechecking by Stephen Eglen, one of the founders of CODECHECK, and a computational neuroscientist based at the University of Cambridge. Stephen's presentation (which can be found [here](https://sje30.github.io/talks/2024/codecheck2024-02.html#/title-slide)) introduced the philosophy behind codecheck, and the importance of the concept of “good enough” in facilitating code reproducibility. The introduction was followed by a live demo codecheck, conducted on a project submitted by Filip Surma of Delft University of Technology. Curious to see what a CODECHECK certificate looks like? See Filip's certificate [here](https://zenodo.org/records/11403956).
+ The Delft event kicked off in the morning with 20 participants and members of the CODECHECK-NL team. We began with a short introduction to the project and its scope by Frank Ostermann (PI for the project, based at the University of Twente), followed by an introduction to codechecking by Stephen Eglen, one of the founders of CODECHECK, and a computational neuroscientist based at the University of Cambridge. Stephen's presentation (which can be found [here](https://sje30.github.io/talks/2024/codecheck2024-02.html)) introduced the philosophy behind codecheck, and the importance of the concept of “good enough” in facilitating code reproducibility. The introduction was followed by a live demo codecheck, conducted on a project submitted by Filip Surma of Delft University of Technology. Curious to see what a CODECHECK certificate looks like? See Filip's certificate [here](https://zenodo.org/records/11403956).
Following lunch and the successful codecheck of Filip's project, carried out in collaboration with the participants, we moved into breakout sessions in the afternoon, codechecking three more projects in smaller groups, with the authors also present during the process. Three more successful codechecks later, participants had a much clearer picture of what codechecks are, how the process works, and how easy or difficult it can be to run or reuse someone else's code!