
Commit 8dd8468

Merge pull request #10 from aweinstein/enh/duecredit

2 parents cc341df + 6fed566


60 files changed: +754 −248 lines

.travis.yml

Lines changed: 2 additions & 3 deletions
@@ -57,8 +57,7 @@ install:
       pip install https://github.com/dmsurti/mayavi/archive/4d4aaf315a29d6a86707dd95149e27d9ed2225bf.zip;
       pip install -e git+https://github.com/enthought/ets.git#egg=ets;
     fi
-  - pip install -r requirements.txt  # finish remaining requirements
-  - python setup.py install
+  - pip install -e .
 script:
 - python -W once:FSL:UserWarning:nipype `which nosetests` --with-doctest --with-cov --cover-package nipype --cov-config .coveragerc --logging-level=DEBUG --verbosity=3
 after_success:

@@ -72,4 +71,4 @@ deploy:
   tags: true
   repo: nipy/nipype
   branch: master
-  distributions: "sdist bdist_wheel"
+  distributions: "sdist"

CHANGES

Lines changed: 13 additions & 8 deletions
@@ -1,6 +1,11 @@
-Release 0.12.0-rc1 (April 20, 2016)
-============
-
+Release 0.12.0 (July 12, 2016)
+==============================
+
+* ENH: New interface for Bruker to Nifti converter (https://github.com/nipy/nipype/pull/1523)
+* FIX: output file naming for FIRST outputs (https://github.com/nipy/nipype/pull/1524)
+* ENH: Adds `fslmaths -Tstd` to maths interfaces (https://github.com/nipy/nipype/pull/1518)
+* FIX: Selecting "gamma" in FSL Level1Design now does what the name says (https://github.com/nipy/nipype/pull/1500)
+* ENH: Added grad_dev input to fsl.dti.bedpostx5 interface (https://github.com/nipy/nipype/pull/1493)
 * ENH: ResourceMultiProc plugin to support resource allocation (https://github.com/nipy/nipype/pull/1372)
 * ENH: Added dcm2niix interface (https://github.com/nipy/nipype/pull/1435)
 * ENH: Add nipype_crash_search command (https://github.com/nipy/nipype/pull/1422)

@@ -36,7 +41,7 @@ Release 0.12.0-rc1 (April 20, 2016)
   (https://github.com/nipy/nipype/pull/1460)
 
 Release 0.11.0 (September 15, 2015)
-============
+===================================
 
 * API: Change how hash values are computed (https://github.com/nipy/nipype/pull/1174)
 * ENH: New algorithm: mesh.WarpPoints applies displacements fields to point sets

@@ -122,7 +127,7 @@ Release 0.11.0 (September 15, 2015)
   (https://github.com/nipy/nipype/pull/1142)
 
 Release 0.10.0 (October 10, 2014)
-============
+=================================
 
 * ENH: New miscelaneous interfaces: SplitROIs (mapper), MergeROIs (reducer)
   to enable parallel processing of very large images.

@@ -166,19 +171,19 @@ Release 0.10.0 (October 10, 2014)
 * FIX: Update for FSL 5.0.7 which deprecated Contrast Manager
 
 Release 0.9.2 (January 31, 2014)
-============
+================================
 
 * FIX: DataFinder was broken due to a typo
 * FIX: Order of DataFinder outputs was not guaranteed, it's human sorted now
 * ENH: New interfaces: Vnifti2Image, VtoMat
 
 Release 0.9.1 (December 25, 2013)
-============
+=================================
 
 * FIX: installation issues
 
 Release 0.9.0 (December 20, 2013)
-============
+=================================
 
 * ENH: SelectFiles: a streamlined version of DataGrabber
 * ENH: new tools for defining workflows: JoinNode, synchronize and itersource

CONTRIBUTING.md

Lines changed: 10 additions & 4 deletions
@@ -10,10 +10,16 @@
 * Pull Requests should be tested, if feasible:
   - bugfixes should include regression tests
   - new behavior should at least get minimal exercise
-* Use a descriptive prefix for your PR: ENH, FIX, TST, DOC, STY, REF (refactor), WIP (Work in progress)
-* After submiting the PR, include an update to the CHANGES file: prefix: description (URL of pull request)
-* `make specs`
-* do: `make check-before-commit` before submitting the PR. This will require you to either install or be in developer mode with: `python setup.py install/develop`.
+* Use a descriptive prefix for your PR: ENH (enhancement), FIX, TST, DOC, STY, REF (refactor), WIP (Work in progress)
+* The person who accepts/merges your PR will include an update to the CHANGES file: prefix: description (URL of pull request)
+* Run `make check-before-commit` before submitting the PR.
+  This will require you to either install or be in developer mode with: `python setup.py install/develop`.
+* In general, do not catch exceptions without good reason:
+  * When catching non-fatal exceptions, log the exception as a warning.
+  * When adding more information about what may have caused the error,
+    raise a new exception using ``raise_from(NewException("message"), oldException)`` from ``future``.
+    Do not log this, as it creates redundant/confusing logs.
 
 ## Contributing issues
 
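The exception-chaining guideline added above can be sketched in plain Python. On Python 3, `future`'s `raise_from` is equivalent to the native `raise ... from ...` statement shown here; the `NodeExecutionError` class and `run_step` function are hypothetical names invented for this illustration, not nipype APIs.

```python
class NodeExecutionError(Exception):
    """Hypothetical wrapper exception, used only for this sketch."""


def run_step():
    try:
        int("not-a-number")  # some operation that fails
    except ValueError as err:
        # Add context and chain the original exception; per the guideline,
        # do not log here -- the chained traceback already carries the cause.
        raise NodeExecutionError("could not parse node parameter") from err


try:
    run_step()
except NodeExecutionError as exc:
    # The original ValueError is preserved on __cause__.
    assert isinstance(exc.__cause__, ValueError)
```

Chaining keeps both tracebacks in the error report, which is why the guideline says logging the wrapped exception again would be redundant.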

Makefile

Lines changed: 4 additions & 3 deletions
@@ -15,8 +15,6 @@ sdist: zipdoc
 	python setup.py sdist
 	@echo "Done building source distribution."
 	# XXX copy documentation.zip to dist directory.
-	# XXX Somewhere the doc/_build directory is removed and causes
-	# this script to fail.
 
 egg: zipdoc
 	@echo "Building egg..."

@@ -45,7 +43,10 @@ clean-build:
 clean-ctags:
 	rm -f tags
 
-clean: clean-build clean-pyc clean-so clean-ctags
+clean-doc:
+	rm -rf doc/_build
+
+clean: clean-build clean-pyc clean-so clean-ctags clean-doc
 
 in: inplace # just a shortcut
 inplace:

doc/conf.py

Lines changed: 1 addition & 1 deletion
@@ -82,7 +82,7 @@
 # The short X.Y version.
 version = nipype.__version__
 # The full version, including alpha/beta/rc tags.
-release = "0.11.0"
+release = "0.12.0"
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.

doc/users/grabbing_and_sinking.rst

Lines changed: 5 additions & 0 deletions
@@ -58,6 +58,7 @@ DataGrabber stores its outputs in a field called outfiles.
     datasource1 = nio.DataGrabber()
     datasource1.inputs.base_directory = os.getcwd()
     datasource1.inputs.template = 'data/s1/f3.nii'
+    datasource1.inputs.sort_filelist = True
     results = datasource1.run()
 
 Or you can get at all uncompressed NIfTI files starting with the letter 'f' in

@@ -67,6 +68,7 @@ all directories starting with the letter 's'.
 
     datasource2.inputs.base_directory = '/mass'
     datasource2.inputs.template = 'data/s*/f*.nii'
+    datasource1.inputs.sort_filelist = True
 
 Two special inputs were used in these previous cases. The input `base_directory`
 indicates in which directory to search, while the input `template` indicates the

@@ -89,6 +91,7 @@ then used to set the template (see %d in the template).
     datasource3 = nio.DataGrabber(infields=['run'])
     datasource3.inputs.base_directory = os.getcwd()
     datasource3.inputs.template = 'data/s1/f%d.nii'
+    datasource1.inputs.sort_filelist = True
     datasource3.inputs.run = [3, 7]
 
 This will return files `basedir/data/s1/f3.nii` and `basedir/data/s1/f7.nii`. We

@@ -98,6 +101,7 @@ can take this a step further and pair subjects with runs.
 
     datasource4 = nio.DataGrabber(infields=['subject_id', 'run'])
     datasource4.inputs.template = 'data/%s/f%d.nii'
+    datasource1.inputs.sort_filelist = True
     datasource4.inputs.run = [3, 7]
     datasource4.inputs.subject_id = ['s1', 's3']

@@ -115,6 +119,7 @@ wish to retrieve all the functional runs and the structural image for the subjec
     datasource = nio.DataGrabber(infields=['subject_id'], outfields=['func', 'struct'])
     datasource.inputs.base_directory = 'data'
     datasource.inputs.template = '*'
+    datasource1.inputs.sort_filelist = True
     datasource.inputs.field_template = dict(func='%s/f%d.nii',
                                             struct='%s/struct.nii')
     datasource.inputs.template_args = dict(func=[['subject_id', [3,5,7,10]]],
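The `sort_filelist` input added throughout this file exists because filesystem glob order is not guaranteed. The following plain-Python sketch (not nipype code; the file names are made up) shows the kind of determinism the flag buys; nipype's actual sorting may differ in detail (e.g. human/natural ordering), so treat `sorted()` here as an approximation.

```python
# Glob results can come back in arbitrary, platform-dependent order.
unordered = ["data/s1/f7.nii", "data/s1/f3.nii", "data/s1/f10.nii"]

# With sort_filelist=True the match list is sorted before use, so
# downstream nodes see a deterministic input order across runs.
deterministic = sorted(unordered)
print(deterministic)
# ['data/s1/f10.nii', 'data/s1/f3.nii', 'data/s1/f7.nii'] (lexicographic)
```

Note that plain lexicographic sorting puts `f10` before `f3`; this is exactly the kind of surprise that motivated the "human sorted" fix for DataFinder mentioned in the CHANGES file above.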

doc/users/install.rst

Lines changed: 22 additions & 42 deletions
@@ -9,10 +9,9 @@ This page covers the necessary steps to install Nipype.
 Download
 --------
 
-Release 0.10.0: [`zip <https://github.com/nipy/nipype/archive/0.10.0.zip>`__ `tar.gz
-<https://github.com/nipy/nipype/archive/0.10.0.tar.gz>`__]
+Current release: `<https://github.com/nipy/nipype/releases/latest>`_.
 
-Development: [`zip <http://github.com/nipy/nipype/zipball/master>`__ `tar.gz
+Development version: [`zip <http://github.com/nipy/nipype/zipball/master>`__ `tar.gz
 <http://github.com/nipy/nipype/tarball/master>`__]
 
 `Prior downloads <http://github.com/nipy/nipype/tags>`_

@@ -25,13 +24,14 @@ or::
 
     git clone https://github.com/nipy/nipype.git
 
+Check out the list of nipype's `current dependencies <https://github.com/shoshber/nipype/blob/master/nipype/info.py#L105>`_.
+
 Install
 -------
 
 The installation process is similar to other Python packages.
 
-If you already have a Python environment setup that has the dependencies listed
-below, you can do::
+If you already have a Python environment set up, you can do::
 
     easy_install nipype
 

@@ -61,8 +61,7 @@ If you downloaded the source distribution named something
 like ``nipype-x.y.tar.gz``, then unpack the tarball, change into the
 ``nipype-x.y`` directory and install nipype using::
 
-    pip install -r requirements.txt
-    python setup.py install
+    pip install -e .
 
 **Note:** Depending on permissions you may need to use ``sudo``.
 

@@ -76,8 +75,18 @@ nose_ installed, then do the following::
 
 you can also test with nosetests::
 
-    nosetests --with-doctest /software/nipy-repo/masternipype/nipype
-    --exclude=external --exclude=testing
+    nosetests --with-doctest <installation filepath>/nipype --exclude=external --exclude=testing
+
+or::
+
+    nosetests --with-doctest nipype
+
+A successful test run should complete in a few minutes and end with
+something like::
+
+    Ran 13053 tests in 126.618s
+
+    OK (SKIP=66)
 
 All tests should pass (unless you're missing a dependency). If SUBJECTS_DIR
 variable is not set some FreeSurfer related tests will fail. If any tests

@@ -89,9 +98,9 @@ tests::
 
     export MATLABCMD=$pathtomatlabdir/bin/$platform/MATLAB
 
-where, $pathtomatlabdir is the path to your matlab installation and
-$platform is the directory referring to x86 or x64 installations
-(typically glnxa64 on 64-bit installations).
+where ``$pathtomatlabdir`` is the path to your matlab installation and
+``$platform`` is the directory referring to x86 or x64 installations
+(typically ``glnxa64`` on 64-bit installations).
 
 Avoiding any MATLAB calls from testing
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

@@ -102,38 +111,9 @@ On unix systems, set an empty environment variable::
 
 This will skip any tests that require matlab.
 
-Dependencies
+Recommended Software
 ------------
 
-Below is a list of required dependencies, along with additional software
-recommendations.
-
-Must Have
-~~~~~~~~~
-
-.. note::
-
-    Full distributions of Nipype, such as the ones in Anaconda_ or Canopy_, provide
-    the following packages automatically.
-
-Nibabel_ 1.0 - 1.4
-  Neuroimaging file i/o library.
-
-Python_ 2.7
-
-NetworkX_ 1.0 - 1.8
-  Python package for working with complex networks.
-
-NumPy_ 1.3 - 1.7
-
-SciPy_ 0.7 - 0.12
-  Numpy and Scipy are high-level, optimized scientific computing libraries.
-
-Enthought_ Traits_ 4.0.0 - 4.3.0
-
-Dateutil 1.5 -
-
-
 Strong Recommendations
 ~~~~~~~~~~~~~~~~~~~~~~
doc/users/pipeline_tutorial.rst

Lines changed: 6 additions & 23 deletions
Original file line numberDiff line numberDiff line change
@@ -49,7 +49,7 @@ Requirements
4949
- FSL_, FreeSurfer_, Camino_, ConnectomeViewer and MATLAB_ are available and
5050
callable from the command line
5151

52-
- SPM_ 5/8 is installed and callable in matlab
52+
- SPM_ 5/8/12 is installed and callable in matlab
5353

5454
- Space: 3-10 GB
5555

@@ -59,29 +59,12 @@ Checklist for analysis tutorials
5959
For the analysis tutorials, we will be using a slightly modified version of the
6060
FBIRN Phase I travelling data set.
6161

62-
Step 0
63-
~~~~~~
62+
1. Download and extract the `Pipeline tutorial data (429MB).
63+
<https://figshare.com/articles/nipype_tutorial_data/3395806>`_
64+
(md5: d175083784c5167de4ea11b43b37c166)
6465

65-
Download and extract the `Pipeline tutorial data (429MB).
66-
<https://dl.dropbox.com/s/jzgq2nupxyz36bp/nipype-tutorial.tar.bz2>`_
67-
68-
(checksum: 56ed4b7e0aac5627d1724e9c10cd26a7)
69-
70-
71-
Step 1.
72-
~~~~~~~
73-
74-
Ensure that all programs are available by calling ``bet``, ``matlab``
75-
and then ``which spm`` within matlab to ensure you have spm5/8 in your
66+
2. Ensure that all programs are available by calling ``bet``, ``matlab``
67+
and then ``which spm`` within matlab to ensure you have spm5/8/12 in your
7668
matlab path.
7769

78-
Step 2.
79-
~~~~~~~
80-
81-
You can now run the tutorial by typing ``python tutorial_script.py``
82-
within the nipype-tutorial directory. This will run a full first level
83-
analysis on two subjects following by a 1-sample t-test on their first
84-
level results. The next section goes through each section of the
85-
tutorial script and describes what it is doing.
86-
8770
.. include:: ../links_names.txt

doc/users/resource_sched_profiler.rst

Lines changed: 3 additions & 3 deletions
@@ -144,17 +144,17 @@ The pandas_ Python package is required to use this feature.
     from nipype.pipeline.plugins.callback_log import log_nodes_cb
     args_dict = {'n_procs' : 8, 'memory_gb' : 10, 'status_callback' : log_nodes_cb}
     workflow.run(plugin='MultiProc', plugin_args=args_dict)
-
+
     # ...workflow finishes and writes callback log to '/home/user/run_stats.log'
-
+
     from nipype.utils.draw_gantt_chart import generate_gantt_chart
     generate_gantt_chart('/home/user/run_stats.log', cores=8)
     # ...creates gantt chart in '/home/user/run_stats.log.html'
 
 The ``generate_gantt_chart`` function will create an html file that can be viewed
 in a browser. Below is an example of the gantt chart displayed in a web browser.
 Note that when the cursor is hovered over any particular node bubble or resource
-bubble, some additional information is shown in a pop-up. 
+bubble, some additional information is shown in a pop-up.
 
 * - .. image:: images/gantt_chart.png
     :width: 100 %
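The snippet above passes `log_nodes_cb` as a `status_callback` that writes one log line per node event, which `generate_gantt_chart` later parses. The following is a hypothetical stand-in for such a callback, written in plain Python to show the shape of the idea; the real `log_nodes_cb` signature and log format in nipype may differ.

```python
import json

log_lines = []  # stands in for the '/home/user/run_stats.log' file


def simple_status_callback(node_name, status):
    # Record one JSON line per node event (start/end), the kind of
    # structured record a gantt-chart generator could consume.
    log_lines.append(json.dumps({"name": node_name, "status": status}))


simple_status_callback("realign", "start")
simple_status_callback("realign", "end")

# Each event is recoverable from its log line.
assert json.loads(log_lines[0]) == {"name": "realign", "status": "start"}
```

The design point is that the plugin only needs a callable to invoke on node state changes; everything about persistence and later visualization lives in the callback.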

doc/users/tutorial_101.rst

Lines changed: 11 additions & 6 deletions
@@ -75,7 +75,8 @@ realigner to the smoother in step 5.
 **3. Creating and configuring a workflow**
 
 Here we create an instance of a workflow and indicate that it should operate in
-the current directory.
+the current directory. The workflow's output will be placed in the ``preproc``
+directory.
 
 .. testcode::
 

@@ -128,11 +129,13 @@ above were generated using this.
 
     workflow.write_graph()
 
-This creates two files graph.dot and graph_detailed.dot and if
+This creates two files ``graph.dot`` and ``graph_detailed.dot`` inside
+``./preproc`` and if
 graphviz_ is installed on your system it automatically converts it
 to png files. If graphviz is not installed you can take the dot files
 and load them in a graphviz visualizer elsewhere. You can specify how detailed
-the graph is going to be, by using "graph2use" argument which takes the following
+the graph is going to be, by using the ``graph2use`` argument which takes
+the following
 options:
 
 * hierarchical - creates a graph showing all embedded workflows (default)

@@ -152,9 +155,11 @@ above pipeline.
     import nipype.algorithms.rapidart as ra
     artdetect = pe.Node(interface=ra.ArtifactDetect(), name='artdetect')
     artdetect.inputs.use_differences = [True, False]
-    art.inputs.use_norm = True
-    art.inputs.norm_threshold = 0.5
-    art.inputs.zintensity_threshold = 3
+    artdetect.inputs.use_norm = True
+    artdetect.inputs.norm_threshold = 0.5
+    artdetect.inputs.zintensity_threshold = 3
+    artdetect.inputs.parameter_source = "SPM"
+    artdetect.inputs.mask_type = "spm_global"
     workflow.connect([(realigner, artdetect,
                        [('realigned_files', 'realigned_files'),
                         ('realignment_parameters','realignment_parameters')]
