(cli-page)=
Nextflow provides a robust command line interface (CLI) for the management and execution of pipelines. This page explains the key concepts and common usage patterns for the CLI.
For the complete reference of all commands, subcommands, and options, see {ref}`cli-reference`.
:::{note}
Nextflow uses two types of command line flags:

- Nextflow options use a single dash (e.g., `-log`) and modify Nextflow's behavior.
- Pipeline parameters use a double dash (e.g., `--input`) and are passed to your pipeline script.
:::
Pipeline execution is the core function of Nextflow. These commands run Nextflow workflows, either from local files or remote Git repositories. Nextflow handles downloading, caching, and executing pipelines with minimal user intervention.
The run command executes pipeline scripts from local files or remote repositories. It automatically manages repository downloads, caching, and execution, supporting various Git providers and authentication methods.
See {ref}`cli-run` for more information.
### Local pipelines

Run a pipeline from your local filesystem:

```console
$ nextflow run main.nf
```

### Remote pipelines

Use the format `<organization>/<repository>` to run a pipeline directly from Git repositories:

```console
$ nextflow run nextflow-io/hello
```

Nextflow automatically:

- Downloads the repository to `$HOME/.nextflow/assets/`
- Caches it for future runs
- Executes the main script
If you omit the organization, Nextflow searches cached pipelines first, then attempts to download from the `NXF_ORG` organization (default: `nextflow-io`).
You can also use full repository URLs:

```console
$ nextflow run https://github.com/nextflow-io/hello
```

### Private repositories

Use the `-user` option to add credentials for private repositories:

```console
$ nextflow run organization/private-repo -user my-username
```

Alternatively, configure Git authentication. See {ref}`Git configuration <git-page>` for more information.
### Non-GitHub providers

Use the `-hub` option to specify Bitbucket, GitLab, or other Git providers:
```console
$ nextflow run organization/repo -hub bitbucket
```

### Revision selection

Use the `-r` option to specify Git branches, tags, or commits:

```console
$ nextflow run nextflow-io/hello -r v1.1
$ nextflow run nextflow-io/hello -r dev-branch
$ nextflow run nextflow-io/hello -r a3f5c8e
```

:::{versionadded} 25.12.0-edge
:::
Nextflow downloads and stores each explicitly requested Git branch, tag, or commit ID in a separate directory path, enabling you to run multiple revisions of the same pipeline simultaneously. Downloaded revisions are stored in a subdirectory of the local project: `$NXF_ASSETS/.repos/<org>/<repo>/clones/<commitId>`.
:::{tip}
Use tags or commit IDs instead of branches for reproducible pipeline runs. Branch references change as development progresses over time.
:::
(cli-params)=
Pipeline parameters are values defined with `params` in your script. Override them on the command line using the `--` prefix to customize pipeline behavior without modifying code.
```console
$ nextflow run main.nf --input data.csv --output results
```

Parameter names support automatic conversion between kebab-case and camelCase:

```console
$ nextflow run main.nf --input-file data.csv   # Becomes params.inputFile
```

Parameters without values are set to `true`:

```console
$ nextflow run main.nf --verbose   # params.verbose = true
```

:::{warning}
Quote parameters containing wildcards to prevent shell expansion:

```console
$ nextflow run main.nf --files "*.fasta"
```
:::
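The mapping from command line flags to `params` values can be seen in a minimal script sketch; the parameter names and defaults here are hypothetical:

```nextflow
// main.nf — hypothetical parameter declarations with defaults
params.input   = 'data.csv'   // replaced by --input <value>
params.verbose = false        // --verbose with no value sets this to true

println "input: ${params.input}, verbose: ${params.verbose}"
```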
### Parameter files

For complex parameter sets, use YAML or JSON files with `-params-file`. This is cleaner than long command lines.
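A parameter file can be written in YAML; a minimal sketch with hypothetical parameter names:

```yaml
# params.yaml — illustrative parameter file
input: data.csv
output: results/
min_quality: 20
```

Pass it with `-params-file params.yaml`.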
```json
{
    "input": "data.csv",
    "output": "results/",
    "min_quality": 20
}
```

```console
$ nextflow run main.nf -params-file params.json
```

### Parameter precedence
Nextflow applies parameters defined in multiple places in the following order (lowest to highest priority):
- Script parameters (`params.foo = 'default'`)
- Configuration parameters (see {ref}`config-params`)
- Parameter files (`-params-file`)
- Command line parameters (`--foo bar`)
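As an illustrative sketch of this order, assume a hypothetical `greeting` parameter defined in several places:

```groovy
// main.nf (script default, lowest priority)
params.greeting = 'hello'

// nextflow.config (overrides the script default)
params.greeting = 'bonjour'

// Command line (highest priority):
//   $ nextflow run main.nf --greeting ciao
// params.greeting evaluates to 'ciao'
```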
The kuberun command executes pipelines entirely within a Kubernetes cluster. This experimental feature runs the Nextflow driver itself inside Kubernetes.
Use this when you want both the Nextflow driver and tasks running in Kubernetes. This differs from using the Kubernetes executor with run, where only tasks run in Kubernetes while the driver runs externally.
```console
$ nextflow kuberun nextflow-io/hello
```

See {ref}`cli-kuberun` for more information.
Project management commands interact with Git-hosted pipelines. Nextflow integrates with Git providers (e.g., GitHub, GitLab, and Bitbucket) to treat pipelines as versioned projects and maintains a local cache in `$HOME/.nextflow/assets/`.
Use these commands to explore available pipelines, inspect their code, maintain your cache, and clone projects.
The list command shows all pipelines currently downloaded to your local cache. This helps you track which projects are available offline and manage your cache directory. Pipelines are stored in `$HOME/.nextflow/assets/` by default.
```console
$ nextflow list
```

The info command displays detailed metadata about a downloaded project, including its repository location, local path, main script, and available revisions.
Use this to understand a project's structure, see available versions, or verify which revision is currently checked out.
```console
$ nextflow info hello
 project name: nextflow-io/hello
 repository  : https://github.com/nextflow-io/hello
 local path  : $HOME/.nextflow/assets/.repos/nextflow-io/hello
 main script : main.nf
 revisions   :
 > master (default)
   mybranch
 > v1.1 [t]
   v1.2 [t]
```

This shows:
- The full project name and repository URL
- Where it's cached locally
- Which script runs by default
- Available revisions (branches and tags, with tags marked by `[t]`)
- Which revisions are currently checked out (marked with `>`)
The pull command downloads a pipeline or updates an existing one to the latest version from its Git repository.
Use this to manually download pipelines before running them, update cached pipelines, or download pipelines for offline use.
```console
$ nextflow pull nextflow-io/hello
```

You can specify a particular revision to download:

```console
$ nextflow pull nextflow-io/hello -r mybranch
```

See {ref}`cli-pull` for more information.
The view command displays the contents of a pipeline's main script or lists all files in the repository.
Use this to quickly inspect pipeline code without opening files or explore the project structure. Use the `-l` option to list all repository files instead of showing script contents.
```console
$ nextflow view nextflow-io/hello
$ nextflow view nextflow-io/hello -l
```

See {ref}`cli-view` for more information.
The clone command copies a pipeline from the cache to a local directory and creates a full Git repository you can modify.
Use this when you want to modify an existing pipeline, create a derivative pipeline, or study a pipeline's structure. If you omit the target directory, the pipeline name is used.
```console
$ nextflow clone nextflow-io/hello my-hello
```

See {ref}`cli-clone` for more information.
The drop command removes a downloaded pipeline from the local cache.
Use this to free disk space by removing pipelines you no longer need. The project is deleted from `$HOME/.nextflow/assets/`. The next run will download it again if needed.

```console
$ nextflow drop nextflow-io/hello
```

See {ref}`cli-drop` for more information.
The secrets command manages secure pipeline secrets.
Use this to store credentials securely, reference them in pipelines without exposing values, and manage sensitive data centrally across your organization.
```console
$ nextflow secrets list
$ nextflow secrets set AWS_ACCESS_KEY_ID
$ nextflow secrets delete AWS_ACCESS_KEY_ID
```

See {ref}`cli-secrets` for more information.
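A stored secret is typically consumed through the `secret` directive of a process, which exposes it to the task as an environment variable. A minimal sketch, assuming a secret named `AWS_ACCESS_KEY_ID` has been set and a hypothetical `my_tool` command:

```nextflow
process useSecret {
    // grant this task access to the named secret
    secret 'AWS_ACCESS_KEY_ID'

    script:
    """
    my_tool --access-key \$AWS_ACCESS_KEY_ID
    """
}
```

Note the `\$` escape: the variable is resolved by the task shell at runtime, so its value never appears in the pipeline script.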
:::{versionadded} 26.04.0
:::
Module management commands enable working with reusable, registry-based modules. The Nextflow module system allows you to install, run, search, and publish standardized modules from registries, eliminating duplicate work and spreading improvements throughout the community.
Use these commands to discover modules in registries, install them into your project, run them directly without creating a workflow, and publish your own modules for others to use.
The module install command downloads modules from a registry and makes them available in your workflow. Modules are stored locally in the `modules/` directory and version information is tracked in `nextflow_spec.json`.
Use this to add reusable modules to your pipeline, manage module versions, or update modules to newer versions.
```console
$ nextflow module install nf-core/fastqc
$ nextflow module install nf-core/fastqc -version 1.0.0
```

After installation, the module will be available in `modules/@nf-core/fastqc` and included in `nextflow_spec.json`.
Use the `-force` flag to reinstall a module even if local modifications exist.
See {ref}cli-module-install for more information.
The module run command executes a module directly from the registry without requiring a wrapper workflow. This provides immediate access to module functionality for ad-hoc tasks or testing.
Use this to quickly run a module, test module functionality, or execute one-off data processing tasks.
```console
$ nextflow module run nf-core/fastqc --input 'data/*.fastq.gz'
$ nextflow module run nf-core/fastqc --input 'data/*.fastq.gz' -version 1.0.0
```

The command accepts all standard Nextflow execution options (`-profile`, `-resume`, etc.):

```console
$ nextflow module run nf-core/salmon \
    --reads reads.fq \
    --index salmon_index \
    -profile docker \
    -resume
```

See {ref}`cli-module-run` for more information.
The module list command displays all modules currently installed in your project, showing their versions and integrity status.
Use this to review installed modules, check module versions, or detect local modifications.
```console
$ nextflow module list
$ nextflow module list -json
```

The output shows each module's name, installed version, and whether it has been modified locally. Use `-json` for machine-readable output suitable for scripting.
See {ref}cli-module-list for more information.
The module search command queries the module registry to discover available modules by keyword or name.
Use this to find modules for specific tasks, explore available tools, or discover community contributions.
```console
$ nextflow module search alignment
$ nextflow module search "quality control" -limit 10
$ nextflow module search bwa -json
```

Results include module names, versions, descriptions, and download statistics. Use `-limit` to control the number of results and `-json` for programmatic access.
See {ref}cli-module-search for more information.
The module info command displays detailed metadata and usage information for a specific module from the registry.
Use this to understand module requirements, view input/output specifications, see available tools, or generate usage templates before installing or running a module.
```console
$ nextflow module info nf-core/fastqc
$ nextflow module info nf-core/fastqc -version 1.0.0
$ nextflow module info nf-core/fastqc -json
```

The output includes the module's version, description, authors, keywords, tools, input/output channels, and a generated usage template showing how to run the module. Use `-json` for machine-readable output suitable for programmatic access.
See {ref}cli-module-info for more information.
The module remove command deletes modules from your project, removing local files and configuration entries.
Use this to clean up unused modules, free disk space, or remove deprecated modules from your pipeline.
```console
$ nextflow module remove nf-core/fastqc
$ nextflow module remove nf-core/fastqc -keep-config
$ nextflow module remove nf-core/fastqc -keep-files
```

By default, both local files and configuration entries are removed. Use `-keep-config` to preserve version information in `nextflow_spec.json`, or `-keep-files` to remove only the configuration entry while keeping local files.
See {ref}cli-module-remove for more information.
The module publish command uploads modules to a registry, making them available for others to install and use.
Use this to share your modules with the community, contribute to module libraries, or distribute modules within your organization.
```console
$ nextflow module publish myorg/my-module
$ nextflow module publish myorg/my-module -dry-run
```

Publishing requires authentication via the `NXF_REGISTRY_TOKEN` environment variable or `registry.apiKey` in the Nextflow configuration. The module must include `main.nf`, `meta.yaml`, and `README.md` files.
Use `-dry-run` to validate your module structure without uploading.
See {ref}cli-module-publish for more information.
Configuration and validation options and commands help you control and verify pipeline settings. Configuration options supplement pipeline configuration at runtime, while validation commands inspect how Nextflow interprets your configuration files, process definitions, and scripts.
Use these to customize pipeline configuration, debug configuration issues, verify settings, and catch issues before execution.
The `-c` option adds your configuration on top of the defaults and merges them together.
Use this when you want to override specific settings while keeping other defaults intact. Multiple configuration files can be specified as a comma-separated list.
```console
$ nextflow -c my.config run nextflow-io/hello
```

See {ref}`config-page` for more information.
The `-C` option replaces all default configuration with your custom configuration files. Multiple configuration files can be specified as a comma-separated list.

Use this when you want to ensure no default configuration interferes with your custom settings. Unlike `-c`, which merges configurations, `-C` ensures only your specified files are used.
```console
$ nextflow -C my.config run nextflow-io/hello
```

See {ref}`config-page` for more information.
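As a sketch, a custom configuration file (the file name and settings here are illustrative) might look like:

```groovy
// my.config — resource overrides for all processes
process {
    cpus   = 4
    memory = '8 GB'
}
```

With `-c my.config` these settings are merged on top of the defaults; with `-C my.config` they become the only configuration applied.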
The config command prints the resolved configuration for a pipeline.
Use this to debug configuration issues, verify which settings will be applied, understand configuration precedence, or inspect specific configuration properties.
```console
$ nextflow config
$ nextflow config nextflow-io/hello
```

See {ref}`cli-config` for more information.
:::{versionadded} 23.10.0
:::
The inspect command analyzes process settings in a pipeline without executing it. It outputs container information in JSON or Nextflow configuration format.
Use this to determine which container images will be used by each process before running the pipeline.
```console
$ nextflow inspect nextflow-io/hello
$ nextflow inspect nextflow-io/hello -format json
```

See {ref}`cli-inspect` for more information.
The lint command analyzes Nextflow scripts and configuration files for syntax errors and code issues. It can also automatically format your code to maintain consistent style across your project.
Use this to catch syntax errors before execution, enforce consistent code formatting, or validate entire directories of Nextflow code.
```console
$ nextflow lint main.nf
```

See {ref}`cli-lint` for more information.
Execution history and maintenance commands manage past runs and clean up cached files. Nextflow maintains metadata about all executions and stores intermediate files in work directories.
Use these commands to review past executions, free disk space, troubleshoot failures, or explore data lineage.
The log command displays execution history and details about past pipeline runs, such as run names, timestamps, and customizable output fields.
Use this to find run names for resuming, review execution history, or debug failed runs. The command shows recent executions by default, with options to view specific runs or customize output fields.
```console
$ nextflow log
$ nextflow log dreamy_euler
$ nextflow log last -f name,status,duration
```

See {ref}`cli-log` for more information.
The clean command removes work directories and cached intermediate files from past executions.
Use this to free disk space, clean up failed or test runs, or maintain your work directory. Use `-n` to perform a dry run and show what would be deleted. Use `-f` to delete files.
```console
$ nextflow clean -n
$ nextflow clean dreamy_euler -f
```

See {ref}`cli-clean` for more information.
:::{versionadded} 25.04.0
:::

:::{warning}
This feature is experimental and may change in a future release.
:::
The lineage command explores data lineage and provenance for workflow executions and tracks relationships between inputs, outputs, and processing steps.
Use this to understand input/output relationships between tasks, trace data flow through the pipeline, or establish file provenance. Lineage tracking must be enabled in configuration.
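Enabling lineage tracking is a small configuration change; a minimal sketch (see the data lineage docs for the full set of options):

```groovy
// nextflow.config — enable lineage tracking
lineage {
    enabled = true
}
```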
```console
$ nextflow lineage
```

See {ref}`data-lineage-page` to get started and {ref}`cli-lineage` for more information.
Seqera Platform is a comprehensive workflow orchestration platform that extends Nextflow with features for workflow management, monitoring, and collaboration.
Use these commands to authenticate with Seqera Platform and launch workflows directly to the Platform's managed infrastructure.
:::{versionadded} 25.10.0
:::
The auth command manages authentication credentials for Seqera Platform, saving access tokens for API interactions.
Use this to log in or out of the Platform, establishing or removing your authentication credentials.
```console
$ nextflow auth login
```

Additional authentication operations include checking login status, viewing configuration details, and logging out:

```console
$ nextflow auth status
$ nextflow auth config
$ nextflow auth logout
```

See {ref}`cli-auth` for more information.
:::{versionadded} 25.10.0
:::
The launch command submits a workflow to run on Seqera Platform's infrastructure instead of your local machine.
Use this to leverage Platform's cloud resources, monitoring capabilities, and execution management. The Platform handles resource provisioning, execution monitoring, and result storage.
```console
$ nextflow launch nextflow-io/hello
```

See {ref}`cli-launch` for more information.
System utilities provide administrative and development tools for managing Nextflow itself, interacting with remote filesystems, working with plugins, and debugging.
Use these commands for system administration, development, and testing.
The console command launches an interactive Groovy console with Nextflow's execution context loaded.
Use this to test Nextflow DSL code interactively, debug expressions, explore Nextflow's APIs, or experiment with syntax. It opens a GUI or REPL depending on your environment.
```console
$ nextflow console
```

The fs command performs filesystem operations on remote storage systems supported by Nextflow, such as S3, Google Cloud Storage, and Azure Blob Storage.
Use this to manage remote data, test cloud storage access, or perform bulk file operations without additional tools.
```console
$ nextflow fs list s3://my-bucket/data/
$ nextflow fs cat s3://my-bucket/data/file.txt
$ nextflow fs cp s3://my-bucket/data/file.txt s3://dest/
$ nextflow fs delete s3://my-bucket/data/file.txt
```

See {ref}`cli-fs` for more information.
The plugin command creates plugins, installs them, and executes plugin-specific operations.
See {ref}`cli-plugin` for more information.
### Plugin creation

:::{versionadded} 25.04.0
:::

Use the `create` subcommand to create a new plugin scaffold for development:

```console
$ nextflow plugin create
```

### Plugin installation

Use the `install` subcommand to install a plugin and extend Nextflow functionality:

```console
$ nextflow plugin install my-plugin
```

### Plugin execution

Use the format `<plugin-name>:<command>` to execute plugin-specific commands:

```console
$ nextflow plugin my-plugin:hello --alpha --beta
```

See individual plugin documentation for plugin-specific commands.
The self-update command updates Nextflow to a newer version. It downloads and installs the latest release or a specific version.
Use this to upgrade Nextflow, switch versions, or install edge releases. By default, it updates to the latest stable release. Specify a particular version or use the `NXF_EDGE` environment variable for development releases.
```console
$ nextflow self-update
$ NXF_EDGE=1 nextflow self-update
```

The help command displays detailed usage information for any Nextflow command.
Use this to learn about command-specific options, refresh your memory about syntax, or discover available features for a particular command.
```console
$ nextflow help run
```

The `-v` and `-version` options print Nextflow version information.
Use `-v` for minimal output showing the version and build number.
```console
$ nextflow -v
nextflow version 24.04.0.5917
```

Use `-version` for detailed output showing creation date, citation, and website.
```console
$ nextflow -version

      N E X T F L O W
      version 24.04.0 build 5917
      created 03-05-2024 15:07 UTC
      cite doi:10.1038/nbt.3820
      http://nextflow.io
```