
Commit 8868981

📝 Update YAML sample for Pipeline Config
1 parent ee194e2

File tree

1 file changed

docs/_sources/user/pipelines/pipeline_config.rst

Lines changed: 51 additions & 12 deletions
@@ -53,7 +53,7 @@ If you want to base a pipeline on another pipeline configuration YAML file, you
    FROM: /path/to/pipeline.yml

-in your pipeline configuration file. You can use the name of a :doc:`preconfigured pipeline </user/preconfig>` instead of a filepath if you want to base a configuration file on a preconfigured pipeline. If ``FROM`` is not specified, the pipeline will be based on :doc:`the default pipeline <default>`.
+in your pipeline configuration file. You can use the name of a :doc:`preconfigured pipeline </user/preconfig>` instead of a filepath if you want to base a configuration file on a preconfigured pipeline. If ``FROM`` is not specified, the pipeline will be based on :doc:`the default pipeline </user/pipelines/default>`.

C-PAC will include all expected keys from the pipeline file specified in ``FROM`` (or the default pipeline if none is specified). Any keys specified in a pipeline configuration file will take precedence over the same key in the ``FROM`` base configuration, but all omitted keys will retain their values from the ``FROM`` base configuration.

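As an illustrative sketch of the ``FROM`` merge behaviour described in the hunk above (the base path is the placeholder from the snippet; the overriding key and its value are hypothetical):

    FROM: /path/to/pipeline.yml

    pipeline_setup:

      # This key takes precedence over the same key in the FROM base configuration;
      # every key omitted here keeps its value from the base (or from the default pipeline).
      pipelineName: pipeline01_variant
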
@@ -67,28 +67,67 @@ If you want to run the analysis from terminal::
    cpac run --pipe_config {path to pipeline config} {path to data config}

-Pipeline configuration files, like the data settings and data configuration files discussed in the :doc:`data configuration builder section </user/subject_list_config>`, are stored as YAML files. Similarly, each of the parameters used by C-PAC to assemble your pipeline can be specified as key-value pairs, so a pipeline configuration YAML would have multiple lines of the form ``key: value`` like so::
+Pipeline configuration files, like the data settings and data configuration files discussed in the :doc:`data configuration builder section </user/subject_list_config>`, are stored as YAML files. Similarly, each of the parameters used by C-PAC to assemble your pipeline can be specified as nested key-value pairs, so a pipeline configuration YAML would have multiple lines of the form ``key: value`` like so
+
+.. code-block:: YAML
+
+  pipeline_setup:

    # Name for this pipeline configuration - useful for identification.
-    pipelineName : pipeline01
+    pipelineName: pipeline01
+
+    working_directory:
+
+      # Directory where C-PAC should store temporary and intermediate files.
+      # - This directory must be saved if you wish to re-run your pipeline from where you left off (if not completed).
+      # - NOTE: As it stores all intermediate files, this directory can grow to become very
+      #   large, especially for data with a large amount of TRs.
+      # - If running in a container (Singularity/Docker), you can simply set this to an arbitrary
+      #   name like '/work', and then map (-B/-v) your desired output directory to that label.
+      # - If running outside a container, this should be a full path to a directory.
+      # - This can be written to '/tmp' if you do not intend to save your working directory.
+      path: /tmp
+
+      # Deletes the contents of the Working Directory after running.
+      # This saves disk space, but any additional preprocessing or analysis will have to be completely re-run.
+      remove_working_dir: True
+
+    crash_log_directory:
+
+      # Directory where CPAC should write crash logs.
+      path: /crash
+
+    log_directory:
+
+      # Whether to write log details of the pipeline run to the logging files.
+      run_logging: True

+      path: /logs

-    # Directory where CPAC should store temporary and intermediate files.
-    workingDirectory : /home/runs/pipeline01/work
+    output_directory:

+      # Directory where C-PAC should write out processed data, logs, and crash reports.
+      # - If running in a container (Singularity/Docker), you can simply set this to an arbitrary
+      #   name like '/output', and then map (-B/-v) your desired output directory to that label.
+      # - If running outside a container, this should be a full path to a directory.
+      path: /output

-    # Directory where CPAC should write crash logs.
-    crashLogDirectory : /home/runs/pipeline01/crash
+      # Include extra versions and intermediate steps of functional preprocessing in the output directory.
+      write_func_outputs: False

+      # Include extra outputs in the output directory that may be of interest when more information is needed.
+      write_debugging_outputs: False

-    # Directory where CPAC should place run logs.
-    logDirectory : /home/runs/pipeline01/log
+      # Output directory format and structure.
+      # Options: default, ndmg
+      output_tree: "default"

+      # Generate quality control pages containing preprocessing and derivative outputs.
+      generate_quality_control_images: True

-    # Directory where CPAC should place processed data.
-    outputDirectory : /home/runs/pipeline01/output
+An example of a pipeline configuration YAML file can be found :doc:`here </user/pipelines/default>`_. Tables explaining the keys and their potential values can be found on the individual pages for each of the outputs C-PAC is capable of producing. All pipeline setup configuration files should have the keys in the :doc:`Output Settings </user/output_config>` table defined.

-An example of a pipeline configuration YAML file can be found `here <https://raw.githubusercontent.com/FCP-INDI/C-PAC/master/CPAC/resources/configs/pipeline_config_template.yml>`_. Tables explaining the keys and their potential values can be found on the individual pages for each of the outputs C-PAC is capable of producing. All pipeline configuration files should have the keys in the :doc:`Output Settings </user/output_config>` table defined.
+If ``FROM`` is defined (see above), any undefined keys will be inferred from the pipeline configuration specified; otherwise, any undefined keys will be inferred from the default pipeline.

Why a list?
'''''''''''
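The ``-B``/``-v`` mapping mentioned in the sample's comments is an ordinary container bind mount. A minimal sketch, assuming hypothetical host directories and eliding the rest of the container command line:

    # Docker: bind host directories to the labels used in the pipeline config (host paths are hypothetical)
    docker run -v /home/user/run1/work:/work -v /home/user/run1/output:/output ...

    # Singularity: the equivalent bind mounts
    singularity run -B /home/user/run1/work:/work -B /home/user/run1/output:/output ...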
