
Commit c77c586

Author: bpinsard (committed)
Merge branch 'master' into spm_interfaces
* master: (138 commits)
  fix: CompCor PCA extraction was erroneous. fixed to save the correct components.
  fix: disabled check for current version due to portalocker issues
  doc: changed class to ref
  doc: minor spell checkings etc while reading
  fix: job submission should use info['node'] to get node
  fix: afni volreg to use gen_fname
  fix: spm new segment only outputs files as .nii
  fix: simplified format arg since trait catches errors
  fix: remove hash value computation on run unless requested explicitly by local_hash_check
  doc: pep8 fmrirealign
  fix: examples
  fix: FmriRealign4d
  fix: nitime test failure when display is missing
  fix: tests
  fixed doctest error with comma after -i in WIMT doctest files
  BF -- changed DARTELNorm2MNInputSpec.fwhm to match SmoothInputSpec.fwhm
  doc: added intro slides
  BUG: Explicitly declare arg to fix position
  ENH: Add opt for converting siemans PAR/REC files
  ...
2 parents 9ff4241 + 1e5f7b1 commit c77c586

File tree: 99 files changed (+4361, -3672 lines changed)


CHANGES

Lines changed: 5 additions & 1 deletion
@@ -1,7 +1,11 @@
 Since last release
 ==================
 
-* ENH: New interfaces: MySQLSink, nipy.Similarity, WatershedBEM
+* API: display variable no longer encoded as inputs in commandline interfaces
+
+* ENH: New interfaces: MySQLSink, nipy.Similarity, WatershedBEM,
+  NetworkBasedStatistic, Atropos, N4BiasFieldCorrection, ApplyTransforms,
+  fs.MakeAverageSubject
 
 * FIX: Afni outputs should inherit from TraitedSpec
 

build_docs.py

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@
 # Sphinx import.
 from sphinx.setup_command import BuildDoc
 
-_info_fname = pjoin('nipype', 'info.py')
+_info_fname = pjoin(os.path.dirname(__file__), 'nipype', 'info.py')
 INFO_VARS = {}
 exec(open(_info_fname, 'rt').read(), {}, INFO_VARS)
 
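
To see why the change matters, here is a hedged, standalone sketch of the same pattern (it assumes the usual layout with nipype/info.py sitting next to the build script): the metadata file is resolved relative to the script itself, so the docs build from any working directory.

    import os
    from os.path import join as pjoin

    # Resolve info.py next to this script instead of the current directory.
    _info_fname = pjoin(os.path.dirname(__file__), 'nipype', 'info.py')
    INFO_VARS = {}
    exec(open(_info_fname, 'rt').read(), {}, INFO_VARS)
    print(INFO_VARS.get('__version__'))  # version string defined in info.py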

doc/quickstart.rst

Lines changed: 3 additions & 1 deletion
@@ -15,8 +15,10 @@ Downloading and installing
 Beginner's guide
 ================
 
-By Michael Notter. `Available here`__
+Introductory slides. `Available here`__
+Michael Notter's guide. `Available here`__
 
+__ http://satra.github.com/intro2nipype
 __ http://miykael.github.com/nipype-beginner-s-guide/index.html
 
 User guides

doc/users/caching_tutorial.rst

Lines changed: 4 additions & 4 deletions
@@ -130,18 +130,18 @@ rather than workflows. Use it: instead of data grabber nodes, use for
 instance the `glob` module. To vary parameters, use `for` loops. To make
 reusable code, write Python functions.
 
-One good rule of thumb to respect is to avoid the usage of explicite
-filenames appart from the outermost inputs and outputs of your
+One good rule of thumb to respect is to avoid the usage of explicit
+filenames apart from the outermost inputs and outputs of your
 processing. The reason being that the caching mechanism of
 :mod:`nipy.caching` takes care of generating the unique hashes, ensuring
-that, when you vary parameters, files are not overriden by the output of
+that, when you vary parameters, files are not overridden by the output of
 different computations.
 
 .. topic:: Debuging
 
    If you need to inspect the running environment of the nodes, it may
   be useful to know where they were executed. With `nipype.caching`,
-   you do not control this location as it it encoded by hashes.
+   you do not control this location as it is encoded by hashes.
 
    To find out where an operation has been persisted, simply look in
    it's output variable::
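
For illustration, a minimal sketch of the caching pattern the tutorial describes (the interface choice and the 'structural.nii' filename are assumptions, not part of this commit):

    from nipype.caching import Memory
    from nipype.interfaces import fsl

    mem = Memory(base_dir='.')      # results are persisted under ./nipype_mem
    bet = mem.cache(fsl.BET)        # wrap an interface with the cache

    # Only the outermost input filename is explicit; intermediate locations
    # are hashed directories managed by the cache.
    betted = bet(in_file='structural.nii', mask=True)

    # The output variable reveals where the operation was persisted.
    print(betted.outputs)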

doc/users/config_file.rst

Lines changed: 3 additions & 1 deletion
@@ -80,7 +80,9 @@ Execution
 *remove_unnecessary_outputs*
     This will remove any interface outputs not needed by the workflow. If the
     required outputs from a node changes, rerunning the workflow will rerun the
-    node. (possible values: ``true`` and ``false``; default value: ``true``)
+    node. Outputs of leaf nodes (nodes whose outputs are not connected to any
+    other nodes) will never be deleted independent of this parameter. (possible
+    values: ``true`` and ``false``; default value: ``true``)
 
 *use_relative_paths*
     Should the paths stored in results (and used to look for inputs)
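
As a hedged illustration, the same execution option can be toggled from a script (or placed under the [execution] section of a nipype configuration file):

    from nipype import config

    # Keep all interface outputs, including intermediate ones, for this session.
    config.set('execution', 'remove_unnecessary_outputs', 'false')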

doc/users/debug.rst

Lines changed: 6 additions & 1 deletion
@@ -17,11 +17,16 @@ performance issues.
 
 #. Use the debug config mode. This can be done by setting::
 
-       import config
+       from nipype import config
        config.enable_debug_mode()
 
    as the first import of your nipype script.
 
+   .. note::
+
+      Turning on debug will rerun your workflows and will rerun them after debugging
+      is turned off.
+
 #. There are several configuration options that can help with debugging. See
    :ref:`config_file` for more details::
 
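
A hedged sketch of a debug-friendly script header along these lines; the specific option names below are assumptions drawn from nipype's execution and logging configuration, not from this diff:

    from nipype import config
    config.enable_debug_mode()   # enable before doing any real work with nipype

    # Options that are often relaxed while debugging (names assumed):
    config.set('execution', 'stop_on_first_crash', 'true')
    config.set('execution', 'remove_unnecessary_outputs', 'false')
    config.set('logging', 'workflow_level', 'DEBUG')

    import nipype.pipeline.engine as pe  # remaining imports follow the config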

doc/users/function_interface.rst

Lines changed: 4 additions & 4 deletions
@@ -48,7 +48,7 @@ Which would print ``4``.
 
 Note that, if you are working interactively, the Function interface is
 unable to use functions that are defined within your interpreter session.
-(Specifcally, it can't use functions that live in the ``__main__`` namespace).
+(Specifically, it can't use functions that live in the ``__main__`` namespace).
 
 Using External Packages
 -----------------------
@@ -59,7 +59,7 @@ geared towards neuroimaging, such as Nibabel_, Nipy_, or PyMVPA_.
 
 While this is completely possible (and, indeed, an intended use of the
 Function interface), it does come with one important constraint. The
-function code you write is excecuted in a standalone environment,
+function code you write is executed in a standalone environment,
 which means that any external functions or classes you use have to
 be imported within the function itself::
 
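
To make the constraint concrete, a hedged sketch of a Function interface whose imports live inside the function body (the nibabel calls and the 'functional.nii' filename are assumptions):

    from nipype.interfaces.utility import Function

    def get_voxel_dims(in_file):
        # nibabel is imported here because the function body is executed
        # in a standalone environment, not in the defining module.
        import nibabel as nb
        return list(nb.load(in_file).get_header().get_zooms())

    voxdims = Function(input_names=['in_file'],
                       output_names=['voxel_dims'],
                       function=get_voxel_dims)
    voxdims.inputs.in_file = 'functional.nii'   # hypothetical input file
    res = voxdims.run()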

@@ -77,7 +77,7 @@ Hello World - Function interface in a workflow
 Contributed by: Hänel Nikolaus Valentin
 
 The following snippet of code demonstrates the use of the function interface in
-the context of a workflow. Note the use of `import os` within the function as
+the context of a workflow. Note the use of ``import os`` within the function as
 well as returning the absolute path from the Hello function. The `import` inside
 is necessary because functions are coded as strings and do not have to be on the
 PYTHONPATH. However any function called by this function has to be available on
@@ -139,7 +139,7 @@ the string would be
     add_two_str = "def add_two(val):\n return val + 2\n"
 
 Unlike when using a function object, this input can be set like any other,
-meaning that you could write a function that outputs differnet function
+meaning that you could write a function that outputs different function
 strings depending on some run-time contingencies, and connect that output
 the the ``function_str`` input of a downstream Function interface.
 
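
A brief sketch of driving the ``function_str`` input directly; the output name 'out' is an assumption:

    from nipype.interfaces.utility import Function

    # The function arrives as a plain string, so it could just as well be
    # generated at run time and connected from an upstream node.
    add_two_str = "def add_two(val):\n return val + 2\n"

    adder = Function(input_names=['val'], output_names=['out'])
    adder.inputs.function_str = add_two_str
    adder.inputs.val = 2
    res = adder.run()   # res.outputs.out should be 4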

doc/users/grabbing_and_sinking.rst

Lines changed: 7 additions & 7 deletions
@@ -37,7 +37,7 @@ to simply iterate over subjects.
 
 However, in the context of complex workflows and given that users typically
 arrange their imaging and other data in a semantically hierarchical data store,
-an alternate mechanism for reading and writing the data generated by a workflow
+an alternative mechanism for reading and writing the data generated by a workflow
 is often necessary. As the names suggest DataGrabber is used to get at data
 stored in a shared file system while DataSink is used to store the data
 generated by a workflow into a hierarchical structure on disk.
@@ -46,8 +46,8 @@ generated by a workflow into a hierarchical structure on disk.
 DataGrabber
 ===========
 
-Datagrabber is an interface for collecting files from hard drive. It is very
-flexible and supports almost any file organisation of your data you can imagine.
+DataGrabber is an interface for collecting files from hard drive. It is very
+flexible and supports almost any file organization of your data you can imagine.
 
 You can use it as a trivial use case of getting a fixed file. By default,
 DataGrabber stores its outputs in a field called outfiles.
@@ -60,7 +60,7 @@ DataGrabber stores its outputs in a field called outfiles.
     datasource1.inputs.template = 'data/s1/f3.nii'
     results = datasource1.run()
 
-Or you can get at all uncompressed nifti files starting with the letter 'f' in
+Or you can get at all uncompressed NIfTI files starting with the letter 'f' in
 all directories starting with the letter 's'.
 
 ::
@@ -75,7 +75,7 @@ path matches of the form `/mass/data/s*/f*`.
 
 .. note::
 
-   When used with wildcards (e.g., s* and f* above) Datagrabber does not return
+   When used with wildcards (e.g., s* and f* above) DataGrabber does not return
    data in sorted order. In order to force it to return data in sorted order, one
    needs to set the input `sorted = True`. However, when explicitly specifying an
    order as we will see below, `sorted` should be set to `False`.
@@ -108,7 +108,7 @@ A more realistic use-case
 
 In a typical study one often wants to grab different files for a given subject
 and store them in semantically meaningful outputs. In the following example, we
-wish to retrieve all the functional runs and the structural image for subject 's1'.
+wish to retrieve all the functional runs and the structural image for the subject 's1'.
 
 ::
 
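
A hedged sketch of the use-case just described; the directory layout, run names, and output field names are assumptions:

    import nipype.interfaces.io as nio

    datasource = nio.DataGrabber(infields=['subject_id'],
                                 outfields=['func', 'struct'])
    datasource.inputs.base_directory = 'data'       # assumed layout: data/s1/...
    datasource.inputs.template = '%s/%s.nii'
    datasource.inputs.template_args = dict(
        func=[['subject_id', ['f3', 'f5', 'f7', 'f10']]],   # assumed run names
        struct=[['subject_id', 'struct']])
    datasource.inputs.subject_id = 's1'
    results = datasource.run()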

@@ -147,7 +147,7 @@ iterables that have been used in the workflow. This makes navigating the working
 directory a not so pleasant experience. And typically the user is interested in
 preserving only a small percentage of these outputs. The DataSink interface can
 be used to extract components from this `cache` and store it at a different
-location. For XNAT-based storage, see XNATSink.
+location. For XNAT-based storage, see :ref:`nipype.interfaces.io.XNATSink` .
 
 .. note::
 
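
For the storage side, a minimal hedged sketch of a DataSink (the output path, container name, and connection are assumptions):

    import nipype.interfaces.io as nio

    datasink = nio.DataSink()
    datasink.inputs.base_directory = '/path/to/output'   # hypothetical location
    datasink.inputs.container = 'subject_s1'             # hypothetical container
    # In a workflow, results are wired in, e.g.:
    # workflow.connect(smoother, 'smoothed_files', datasink, 'preproc.@smooth')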

doc/users/install.rst

Lines changed: 2 additions & 2 deletions
@@ -39,13 +39,13 @@ Debian and Ubuntu
 ~~~~~~~~~~~~~~~~~
 
 Add the `NeuroDebian <http://neuro.debian.org>`_ repository and install
-the ``python-nipype`` package using ``apt-get`` or your favourite package
+the ``python-nipype`` package using ``apt-get`` or your favorite package
 manager.
 
 Mac OS X
 ~~~~~~~~
 
-The easiest way to get nipype running on MacOSX is to install EPD_ and then add
+The easiest way to get nipype running on Mac OS X is to install EPD_ and then add
 nibabel and nipype by executing::
 
     easy_install nibabel

examples/fmri_fsl_reuse.py

Lines changed: 0 additions & 9 deletions
@@ -17,15 +17,6 @@
 
 import os # system functions
 
-"""
-.. note::
-   config for logging should be set before anything else
-"""
-
-from nipype.utils.config import config
-config.set('logging', 'log_to_file', 'false')
-config.set_log_dir(os.getcwd())
-
 import nipype.interfaces.io as nio # Data i/o
 import nipype.interfaces.fsl as fsl # fsl
 import nipype.interfaces.utility as util # utility
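
If an example still needs to silence file logging, a hedged sketch with the package-level config object used elsewhere in this merge would be:

    from nipype import config

    # Equivalent of the removed block: keep log output off the filesystem.
    config.set('logging', 'log_to_file', 'false')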
