Conversation

@eduard-watchmaker
Contributor

PR checklist

Closes #XXX

  • This comment contains a description of changes (with reason).
  • If you've fixed a bug or added code that should be tested, add tests!
  • If you've added a new tool, have you followed the module conventions in the contribution docs?
  • If necessary, include test data in your PR.
  • Remove all TODO statements.
  • Emit the versions.yml file.
  • Follow the naming conventions.
  • Follow the parameters requirements.
  • Follow the input/output options guidelines.
  • Add a resource label
  • Use BioConda and BioContainers if possible to fulfil software requirements.
  • Ensure that the test works with either Docker / Singularity. Conda CI tests can be quite flaky:
    • For modules:
      • nf-core modules test <MODULE> --profile docker
      • nf-core modules test <MODULE> --profile singularity
      • nf-core modules test <MODULE> --profile conda
    • For subworkflows:
      • nf-core subworkflows test <SUBWORKFLOW> --profile docker
      • nf-core subworkflows test <SUBWORKFLOW> --profile singularity
      • nf-core subworkflows test <SUBWORKFLOW> --profile conda

@eduard-watchmaker eduard-watchmaker requested review from a team as code owners October 30, 2025 20:50
Contributor

@Joon-Klaps Joon-Klaps left a comment


The changes in nf-test.config are something that needs to be double-checked.

@eduard-watchmaker
Contributor Author

> The changes in nf-test.config are something that needs to be double-checked.

Removed it and added the directives to the specific subworkflow tests!

Contributor

@Joon-Klaps Joon-Klaps left a comment


Sorry, quite a few things I should have spotted in the first round.

Contributor


Shouldn't touch this file; please undo any edits here.

Comment on lines +18 to +41
run("BWA_INDEX") {
    script "../../../../modules/nf-core/bwa/index/main.nf"
    process {
        """
        input[0] = Channel.of([
            [ id:'test' ], // meta map
            file('https://github.com/nf-core/test-datasets/raw/methylseq/reference/genome.fa', checkIfExists: true)
        ])
        """
    }
}

run("BWA_INDEX", alias: 'BWA_INDEX_PE') {
    script "../../../../modules/nf-core/bwa/index/main.nf"
    process {
        """
        input[0] = Channel.of([
            [ id:'test' ], // meta map
            file('https://github.com/nf-core/test-datasets/raw/methylseq/reference/genome.fa', checkIfExists: true)
        ])
        """
    }
}
}
Contributor


If it's the same genome why do we need to run the same indexing twice? We can just have a single global setup.
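A single shared setup, as suggested, could look roughly like this (a sketch only, reusing the script path and channel contents from the snippet above; both the single-end and paired-end tests would then reference the one `BWA_INDEX` output):

```groovy
// One global setup block instead of two aliased runs of the same indexing.
setup {
    run("BWA_INDEX") {
        script "../../../../modules/nf-core/bwa/index/main.nf"
        process {
            """
            input[0] = Channel.of([
                [ id:'test' ], // meta map
                file('https://github.com/nf-core/test-datasets/raw/methylseq/reference/genome.fa', checkIfExists: true)
            ])
            """
        }
    }
}
// Each test can then consume BWA_INDEX.out.index directly,
// for both the single-end and paired-end cases.
```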

Comment on lines +45 to +48
params {
    use_gpu    = true
    bwa_prefix = 'genome.fa'
}
Contributor


Why did you include this? It shouldn't be necessary for the subworkflow, since everything is passed as an input. Best to keep it as simple as possible; remove it everywhere.

Contributor


Aah, now I see. You got your inspiration from the parabricks/fq2bam module. You can do this, but then you also need to update the config file to parse these params into the correct variables.

Given the amount of arguments that there already are in the nextflow.config file, I would just delete this param section.
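The simpler route, which the review recommends, is to drop the `params` block entirely and pass the test values directly as subworkflow inputs, in the same style already used elsewhere in this test file (sketch; input indices depend on the subworkflow signature):

```groovy
// Pass test values directly as positional inputs instead of via params.
input[4] = false // skip_deduplication
input[5] = true  // use_gpu
```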

}
}

test("Params: parabricks/fq2bam single-end | use_gpu ") {
Contributor


This doesn't follow the nf-core naming convention guidelines.

Please also update all other test names

Suggested change
test("Params: parabricks/fq2bam single-end | use_gpu ") {
test("Sarscov2 fasta - SE - deduplicate- with GPU ") {

Comment on lines +64 to +67
input[4] = false // skip_deduplication
input[5] = true // use_gpu
input[6] = Channel.empty() // interval_file
input[7] = Channel.empty() // known_sites
Contributor


Comment on lines +8 to +15
tag "gpu"
tag "subworkflows"
tag "subworkflows_nfcore"
tag "subworkflows/fastq_align_dedup_bwamem"
tag "parabricks/fq2bam"
tag "samtools/index"
tag "picard/markduplicates"
tag "untar"
Contributor


Think you are missing a few in here, mostly all the samtools ones.
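A more complete tag list might look like this (a sketch; the samtools entries are inferred from the `samtools_flagstat`, `samtools_stats`, and `samtools_index_stats` output channels asserted later in this file, so the exact set depends on which modules the subworkflow actually calls):

```groovy
tag "gpu"
tag "subworkflows"
tag "subworkflows_nfcore"
tag "subworkflows/fastq_align_dedup_bwamem"
tag "parabricks/fq2bam"
tag "samtools/index"
tag "samtools/sort"
tag "samtools/flagstat"
tag "samtools/stats"
tag "samtools/idxstats"
tag "picard/markduplicates"
```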

Comment on lines +8 to +15
tag "gpu"
tag "subworkflows"
tag "subworkflows_nfcore"
tag "subworkflows/fastq_align_dedup_bwamem"
tag "parabricks/fq2bam"
tag "samtools/index"
tag "picard/markduplicates"
tag "untar"
Contributor


Maybe I missed it, but where are you using untar?

Comment on lines +202 to +211
{ assert snapshot(
    workflow.out.bam.collect { meta, bamfile -> bam(bamfile).getReadsMD5() },
    workflow.out.bai.collect { meta, bai -> file(bai).name },
    workflow.out.samtools_flagstat,
    workflow.out.samtools_stats,
    workflow.out.samtools_index_stats,
    workflow.out.picard_metrics.collect { meta, metrics -> file(metrics).name },
    workflow.out.multiqc.flatten().collect { path -> file(path).name },
    workflow.out.versions
).match() }
Contributor


This is the stub run; we can just leave it as is.

Comment on lines +166 to +169
assertAll(
    { assert workflow.success },
    { assert snapshot(workflow.out).match() }
)
Contributor


Suggested change
assertAll(
    { assert workflow.success },
    { assert snapshot(workflow.out).match() }
)
assertAll(
    { assert workflow.success },
    { assert snapshot(
        workflow.out,
        workflow.out.versions.collect { path(it).yaml }
    ).match() }
)

@@ -0,0 +1,160 @@
{
Contributor


Yeah, none of the tests seems to have run.
