Commit 42897ec

fix: review comments

1 parent a3310c7 commit 42897ec

3 files changed: +7 −7 lines changed

.github/workflows/deploy-apptainer.yml

Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@ name: Deploy Apptainer
 
 on:
   workflow_run:
-    workflows: ["release-please"]
+    workflows: ["Release Please"]
     types:
       - completed
   workflow_dispatch:

README.md

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
 # snakemake-crispr-guides
 
 [![Snakemake](https://img.shields.io/badge/snakemake-≥8.0.0-green.svg)](https://snakemake.github.io)
-[![Github Actions](https://github.com/MPUSP/snakemake-crispr-guides/actions/workflows/snakemake-tests.yml/badge.svg)](https://github.com/MPUSP/snakemake-crispr-guides/actions/workflows/snakemake-tests.yml)
+[![GitHub Actions](https://github.com/MPUSP/snakemake-crispr-guides/actions/workflows/snakemake-tests.yml/badge.svg)](https://github.com/MPUSP/snakemake-crispr-guides/actions/workflows/snakemake-tests.yml)
 [![run with conda](http://img.shields.io/badge/run%20with-conda-3EB049?labelColor=000000&logo=anaconda)](https://docs.conda.io/en/latest/)
 [![run with singularity](https://img.shields.io/badge/run%20with-singularity-1D355C.svg?labelColor=000000)](https://sylabs.io/docs/)
 [![workflow catalog](https://img.shields.io/badge/Snakemake%20workflow%20catalog-darkgreen)](https://snakemake.github.io/snakemake-workflow-catalog/docs/workflows/MPUSP/snakemake-crispr-guides.html)

@@ -60,7 +60,7 @@ The workflow is built using [snakemake](https://snakemake.readthedocs.io/en/stab
 6. Return report as HTML and PDF files (`weasyprint`)
 7. Export module logs and versions
 
-If you want to contribute, report issues, or suggest features, please get in touch on [github](https://github.com/MPUSP/snakemake-crispr-guides).
+If you want to contribute, report issues, or suggest features, please get in touch on [GitHub](https://github.com/MPUSP/snakemake-crispr-guides).
 
 ## Deployment options
 

@@ -134,7 +134,7 @@ The list of available on-target scores in the [R crisprScore package](https://gi
 
 Another good reason to exclude some scores are the computational resources they require. Particularly deep learning-derived scores are calculated by machine learning models that require both a lot of extra resources in terms of disk space (downloaded and installed _via_ `basilisk` and `conda` environments) and processing power (orders of magnitude longer computation time).
 
-Users can look up all available scores on the [R crisprScore github page](https://github.com/crisprVerse/crisprScore) and decide which ones should be included. In addition, the default behavior of the pipeline is to compute an average score and select the top N guides based on it. The average score is the _weighted mean_ of all single scores and the `score_weights` can be defined in the `config/config.yml` file. If a score should be excluded from the ranking, it's weight can simply be set to zero.
+Users can look up all available scores on the [R crisprScore GitHub page](https://github.com/crisprVerse/crisprScore) and decide which ones should be included. In addition, the default behavior of the pipeline is to compute an average score and select the top N guides based on it. The average score is the _weighted mean_ of all single scores and the `score_weights` can be defined in the `config/config.yml` file. If a score should be excluded from the ranking, it's weight can simply be set to zero.
 
 The default scores are:
 

@@ -201,7 +201,7 @@ The workflow generates the following output from its modules:
 - Dr. Michael Jahn
 - Affiliation: [Max-Planck-Unit for the Science of Pathogens](https://www.mpusp.mpg.de/) (MPUSP), Berlin, Germany
 - ORCID profile: https://orcid.org/0000-0002-3913-153X
-- github page: https://github.com/m-jahn
+- GitHub page: https://github.com/m-jahn
 
 ## License
 

@@ -216,7 +216,7 @@ The code in this repository is published with the [MIT](https://choosealicense.c
 ## Contributions
 
 - Contributions to this package are welcome!
-- Please get in touch on github by [filing a new issue with your suggestion](https://github.com/MPUSP/snakemake-crispr-guides/issues)
+- Please get in touch on GitHub by [filing a new issue with your suggestion](https://github.com/MPUSP/snakemake-crispr-guides/issues)
 - After initial discussion, you are welcome to submit your pull request
 
 ## References

config/README.md

Lines changed: 1 addition & 1 deletion

@@ -78,7 +78,7 @@ The list of available on-target scores in the [R crisprScore package](https://gi
 
 Another good reason to exclude some scores are the computational resources they require. Particularly deep learning-derived scores are calculated by machine learning models that require both a lot of extra resources in terms of disk space (downloaded and installed _via_ `basilisk` and `conda` environments) and processing power (orders of magnitude longer computation time).
 
-Users can look up all available scores on the [R crisprScore github page](https://github.com/crisprVerse/crisprScore) and decide which ones should be included. In addition, the default behavior of the pipeline is to compute an average score and select the top N guides based on it. The average score is the _weighted mean_ of all single scores and the `score_weights` can be defined in the `config/config.yml` file. If a score should be excluded from the ranking, it's weight can simply be set to zero.
+Users can look up all available scores on the [R crisprScore GitHub page](https://github.com/crisprVerse/crisprScore) and decide which ones should be included. In addition, the default behavior of the pipeline is to compute an average score and select the top N guides based on it. The average score is the _weighted mean_ of all single scores and the `score_weights` can be defined in the `config/config.yml` file. If a score should be excluded from the ranking, it's weight can simply be set to zero.
 
 The default scores are:
 
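The `score_weights` behavior described in the README paragraphs above (weighted mean of all single scores; a weight of zero drops a score from the ranking) can be sketched in Python. This is an illustrative sketch only, not the pipeline's actual code; the score names and weight values below are hypothetical, and in the real workflow the weights come from `score_weights` in `config/config.yml`.

```python
def weighted_mean_score(scores, weights):
    """Combine per-guide on-target scores into one average.

    `scores` maps score name -> value for a single guide; `weights` maps
    score name -> weight. A weight of 0 excludes that score from the mean.
    """
    total_weight = sum(weights.get(name, 0.0) for name in scores)
    if total_weight == 0:
        raise ValueError("all score weights are zero; nothing to rank by")
    weighted_sum = sum(value * weights.get(name, 0.0) for name, value in scores.items())
    return weighted_sum / total_weight


# Hypothetical example: exclude a costly deep-learning score by zeroing its weight.
score_weights = {"ruleset1": 1.0, "deephf": 0.0}
guide_scores = {"ruleset1": 0.8, "deephf": 0.2}
print(weighted_mean_score(guide_scores, score_weights))  # 0.8
```

With the zero weight in place, the excluded score contributes nothing to either the numerator or the normalizing weight sum, so the ranking is driven entirely by the remaining scores.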
