Mixing Singularity and AWS Batch executors #3380
-
I'm attempting to run a hybrid workflow with most processes executing on a local cluster (with Slurm + Singularity) and selected processes running on AWS Batch. I can get each to work separately, but when I mix the two executors I run into a problem. I'm using Nextflow version 21.10.5 build 5658. Instead of submitting the process through the AWS Batch executor, Nextflow seems to interpret the "container" field as a local Singularity image rather than as a Docker image to run on AWS Batch. For example, with this config snippet (edited slightly to remove real process names):
I get errors that look like this, as if Nextflow is still trying to run Singularity on our cluster.
Before trying to debug too much, does anyone know whether this type of hybrid execution is supported? Is there a way to mix, say, Singularity and Docker containers in the same workflow, or to customize those settings per process? The "singularity.enabled" option looks to be a top-level configuration that cannot be set per process. Any advice or pointers are appreciated. Thank you!
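For reference, hybrid setups like this are typically expressed with Nextflow process selectors, which let the executor and container be overridden per process while the Singularity engine stays enabled globally. The sketch below is a minimal illustration, not my actual config; the label name, Batch queue, AWS region, and image names are all placeholders:

```groovy
// Hypothetical hybrid config: Slurm + Singularity by default,
// AWS Batch + Docker for processes labeled 'cloud'.
process {
    executor = 'slurm'

    withLabel: 'cloud' {
        executor  = 'awsbatch'
        queue     = 'my-batch-queue'          // placeholder Batch queue
        container = 'quay.io/org/tool:1.0'    // Docker image for Batch jobs
    }
}

singularity.enabled = true   // applies to the local (Slurm) processes

aws {
    region = 'us-east-1'     // placeholder region
    batch.cliPath = '/home/ec2-user/miniconda/bin/aws'  // placeholder path
}
```

With this layout, a process would opt into AWS Batch by declaring `label 'cloud'` in its definition, while everything else runs under Slurm with Singularity.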
-
This should work using the latest version, 22.10.1.