
Commit da67d71

oestebaneffigies authored and committed
fix: typos and links in fmri auditory example
1 parent ff7dd7d · commit da67d71

1 file changed (+29 −24 lines)


examples/fmri_spm_auditory.py

Lines changed: 29 additions & 24 deletions
@@ -33,15 +33,15 @@
 
 """
 
-# Set the way matlab should be called
+# Set the way Matlab should be called
 mlab.MatlabCommand.set_default_matlab_cmd("matlab -nodesktop -nosplash")
 
 """
 
 Setting up workflows
 --------------------
-In this tutorial we will be setting up a hierarchical workflow for spm
-analysis. This will demonstrate how pre-defined workflows can be setup
+In this tutorial we will be setting up a hierarchical workflow for SPM
+analysis. This will demonstrate how predefined workflows can be setup
 and shared across users, projects and labs.
 
 Setup preprocessing workflow
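The hunk above fixes the comment on the line that configures how nipype launches Matlab for the SPM interfaces: the launch command is given as a single string. A stdlib-only sketch of how such a command string maps to argv-style tokens (no nipype or Matlab required; the tokenizer here is illustrative, not necessarily what nipype uses internally):

```python
# Sketch: the Matlab launch command is configured as one string; an executor
# ultimately works with argv-style tokens split from it.
import shlex

cmd = "matlab -nodesktop -nosplash"
argv = shlex.split(cmd)
print(argv)  # ['matlab', '-nodesktop', '-nosplash']
```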
@@ -51,11 +51,11 @@
 """
 
 preproc = pe.Workflow(name='preproc')
-"""We strongly encourage to use 4D files insteead of series of 3D for fMRI analyses
+"""We strongly encourage to use 4D files instead of series of 3D for fMRI analyses
 for many reasons (cleanness and saving and filesystem inodes are among them). However,
 the the workflow presented in the SPM8 manual which this tutorial is based on
 uses 3D files. Therefore we leave converting to 4D as an option. We are using ``merge_to_4d``
-variable, because switching between 3d and 4d requires some additional steps (explauned later on).
+variable, because switching between 3D and 4D requires some additional steps (explained later on).
 Use :ref:`nipype.interfaces.fsl.utils.Merge` to merge a series
 of 3D files along the time dimension creating a 4D file.
 """
@@ -119,8 +119,8 @@ def get_vox_dims(volume):
 
 """Here we are connecting all the nodes together.
 Notice that we add the merge node only if you choose to use 4D.
-Also ``get_vox_dims`` function is passed along the input volume of normalise to set the optimal
-voxel sizes.
+Also, the ``get_vox_dims`` function is passed along the input volume of
+:ref:`nipype.interfaces.spm.preprocess.Normalize` to set the optimal voxel sizes.
 """
 
 if merge_to_4d:
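The docstring rewritten above describes passing ``get_vox_dims`` along a connection, so the voxel sizes read from the input volume reach the Normalize node. A pure-Python sketch of that interposed-function pattern, with no nipype dependency and a fabricated header lookup standing in for the tutorial's nibabel-based ``get_vox_dims``:

```python
# Sketch of nipype's ('output', func) connection form: the function transforms
# the source output before it is set on the destination input. The file name
# and voxel sizes below are fabricated for demonstration.
def get_vox_dims(volume):
    # Stand-in for reading voxel sizes from a NIfTI header with nibabel.
    if isinstance(volume, list):
        volume = volume[0]
    fake_voxel_sizes = {'rf.nii': [3.0, 3.0, 3.0]}
    return fake_voxel_sizes[volume]

dest_inputs = {}

def connect(src_value, transform, dest_name):
    """Mimic the pipe: apply the interposed function, then set the input."""
    dest_inputs[dest_name] = transform(src_value)

connect('rf.nii', get_vox_dims, 'write_voxel_sizes')
print(dest_inputs)  # {'write_voxel_sizes': [3.0, 3.0, 3.0]}
```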
@@ -186,8 +186,8 @@ def get_vox_dims(volume):
 ('spmT_images', 'stat_image')]),
 ])
 """
-Preproc + Analysis pipeline
----------------------------
+Preprocessing and analysis pipeline
+-----------------------------------
 """
 
 l1pipeline = pe.Workflow(name='firstlevel')
@@ -196,7 +196,7 @@ def get_vox_dims(volume):
 'modelspec.realignment_parameters')])])
 
 """
-Pluging in ``functional_runs`` is a bit more complicated,
+Plugging in ``functional_runs`` is a bit more complicated,
 because model spec expects a list of ``runs``.
 Every run can be a 4D file or a list of 3D files.
 Therefore for 3D analysis we need a list of lists and to make one we need a helper function.
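The docstring above explains why a helper is needed: the model spec wants a list of runs, and a single run made of 3D files is itself a list, so it must be wrapped once more. A minimal sketch of such a ``makelist`` helper (the tutorial defines its own; this one and its file names are illustrative):

```python
# Illustrative helper mirroring the tutorial's ``makelist``: wrap one run
# (a list of 3D file names) into a list of runs, producing a list of lists.
def makelist(item):
    """Wrap a single run into a one-element list of runs."""
    return [item]

threed_run = ['vol0000.nii', 'vol0001.nii', 'vol0002.nii']
runs = makelist(threed_run)
print(runs)  # [['vol0000.nii', 'vol0001.nii', 'vol0002.nii']]
```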
@@ -253,10 +253,7 @@ def makelist(item):
 
 """
 Now we create a :ref:`nipype.interfaces.io.DataGrabber`
-object and fill in the information from above about the layout of our data. The
-:class:`nipype.pipeline.NodeWrapper` module wraps the interface object
-and provides additional housekeeping and pipeline specific
-functionality.
+object and fill in the information from above about the layout of our data.
 """
 
 datasource = pe.Node(
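The ``datasource`` node above wraps a DataGrabber, which fills printf-style path templates from node inputs and matches them on disk. A stdlib sketch of that template mechanism (the template and subject ID here are hypothetical, not the tutorial's exact values):

```python
# Sketch of the DataGrabber template idea: a %-style template is filled from
# an input (here, subject_id), then matched against the filesystem with glob.
from glob import glob

field_template = 'data/%s/f*.img'   # %s is filled from the subject_id input
subject_id = 'M00223'               # hypothetical subject identifier
pattern = field_template % subject_id
matches = sorted(glob(pattern))     # empty unless such files exist locally
print(pattern)  # data/M00223/f*.img
```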
@@ -318,18 +315,26 @@ def makelist(item):
 setup the connections between the nodes such that appropriate outputs
 from nodes are piped into appropriate inputs of other nodes.
 
-Use the :class:`nipype.pipeline.engine.Pipeline` to create a
-graph-based execution pipeline for first level analysis. The config
-options tells the pipeline engine to use `workdir` as the disk
-location to use when running the processes and keeping their
-outputs. The `use_parameterized_dirs` tells the engine to create
-sub-directories under `workdir` corresponding to the iterables in the
-pipeline. Thus for this pipeline there will be subject specific
-sub-directories.
+Use the :class:`~nipype.pipeline.engine.workflows.Workflow` to create a
+graph-based execution pipeline for first level analysis.
+Set the :py:attr:`~nipype.pipeline.engine.workflows.Workflow.base_dir`
+option to instruct the pipeline engine to use ``spm_auditory_tutorial/workingdir``
+as the filesystem location to use when running the processes and keeping their
+outputs.
+Other options can be set via `the configuration file
+<https://miykael.github.io/nipype_tutorial/notebooks/basic_execution_configuration.html>`__.
+For example, ``use_parameterized_dirs`` tells the engine to create
+sub-directories under :py:attr:`~nipype.pipeline.engine.workflows.Workflow.base_dir`,
+corresponding to the iterables in the pipeline.
+Thus, for this pipeline there will be subject specific sub-directories.
+
+When building a workflow, interface objects are wrapped within
+a :class:`~nipype.pipeline.engine.nodes.Node` so that they can be inserted
+in the workflow.
 
 The :func:`~nipype.pipeline.engine.workflows.Workflow.connect` method creates the
-links between the processes, i.e., how data should flow in and out of
-the processing nodes.
+links between :class:`~nipype.pipeline.engine.nodes.Node` instances, i.e.,
+how data should flow in and out of the processing nodes.
 """
 
 level1 = pe.Workflow(name="level1")
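The rewritten docstring above says that with parameterized directories, each value of an iterable gets its own sub-directory under ``base_dir``, so per-subject runs never share working files. A stdlib sketch of the resulting layout (the subject IDs are hypothetical and the directory names illustrative; nipype's actual naming may differ):

```python
# Sketch of a parameterized working-directory layout: one sub-directory per
# iterable value (subject), nested under the workflow's base_dir.
import os

base_dir = os.path.join('spm_auditory_tutorial', 'workingdir')
subjects = ['M00223', 'M00224']     # hypothetical subject IDs

for subj in subjects:
    node_dir = os.path.join(base_dir, 'level1', '_subject_id_%s' % subj, 'preproc')
    print(node_dir)
```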
