
Commit 4afd881

Merge branch 'master' into enh/rapidart
* master: (127 commits)
  fix: doctest carriage return
  fix: deprecation warning does not reset trait to Undefined - will set new value
  add support for template files
  API fix for somaworkflow
  PEP8
  Further refactoring
  Reafactoring - less redundant code between SGE and PBS.
  Respect custom execution parameters in graph based models.
  Doctest fix, better error reporting.
  fix: config updated before checking for deprecated options
  put back the n_procs docs
  Udated docs with n_procs for MultiProc
  DOC: Added docstring for MpiCommandLine with example usage.
  DOC: Improve docstring for DataFinder, include example usage.
  fix: adding atlases
  fix: removing recommends
  fix: restore fsl
  fix: remove fsl for now
  fix: variable name
  fix: use neurodebian sources
  ...
2 parents c30b731 + 8e9c2ba commit 4afd881

Some content is hidden: large commits have part of the diff collapsed by default, so only a subset of the changed files is shown below.
85 files changed: +10,401 −2,409 lines

.travis.yml

Lines changed: 21 additions & 22 deletions
@@ -1,22 +1,21 @@
-# vim ft=yaml
-# travis-ci.org definition for nipy build
-#
-# We pretend to be erlang because we need can't use the python support in
-# travis-ci; it uses virtualenvs, they do not have numpy, scipy, matplotlib,
-# and it is impractical to build them
-language: erlang
-env:
-    - PYTHON=python PYSUF=''
-# - PYTHON=python3 PYSUF=3 : python3-numpy not currently available
-install:
-    - sudo apt-get install $PYTHON-dev
-    - sudo apt-get install $PYTHON-numpy
-    - sudo apt-get install $PYTHON-scipy
-    - sudo apt-get install $PYTHON-networkx
-    - sudo apt-get install $PYTHON-traits
-    - sudo apt-get install $PYTHON-setuptools
-    - sudo easy_install$PYSUF nibabel # Latest pypi
-    - sudo apt-get install $PYTHON-nose
-script:
-    # Change into an innocuous directory and find tests from installation
-    - make test
+language: python
+python:
+    - "2.7"
+before_install:
+    - deactivate
+    - sudo apt-get update -qq
+    - sudo apt-get install lsb-release
+    - source /etc/lsb-release
+    - wget -O- http://neuro.debian.net/lists/${DISTRIB_CODENAME}.us-nh | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list
+    - sudo apt-key adv --recv-keys --keyserver pgp.mit.edu 2649A5A9
+    - sudo apt-get update -qq
+    - sudo apt-get install -qq python-scipy python-nose
+    - sudo apt-get install -qq python-networkx python-traits python-setuptools
+    - sudo apt-get install -qq python-nibabel
+    - sudo apt-get install -qq --no-install-recommends fsl afni
+    - sudo apt-get install -qq fsl-atlases
+    - source /etc/fsl/fsl.sh
+    - virtualenv --system-site-packages ~/virtualenv/this
+    - source ~/virtualenv/this/bin/activate
+install: python setup.py build_ext --inplace
+script: make test

CHANGES

Lines changed: 4 additions & 1 deletion
@@ -1,11 +1,14 @@
 Next release
 ============

-* ENH: New interfaces: ICC, Meshfix, ants.Register
+* ENH: Add basic support for LSF plugin.
+* ENH: New interfaces: ICC, Meshfix, ants.Register, C3dAffineTool, ants.JacobianDeterminant
 * ENH: New workflows: ants template building (both using 'ANTS' and the new 'antsRegistration')
 * ENH: New examples: how to use ANTS template building workflows (smri_ants_build_tmeplate),
   how to set SGE specific options (smri_ants_build_template_new)
 * ENH: added no_flatten option to Merge
+* ENH: added deprecation metadata to traits
+* ENH: Slicer interfaces were updated to version 4.1

 Release 0.6.0 (Jun 30, 2012)
 ============================

doc/devel/interface_specs.rst

Lines changed: 31 additions & 1 deletion
@@ -262,7 +262,37 @@ Common
     can be set to either `True` or `False`. `False` indicates that contents
     should be symlinked, while `True` indicates that the contents should be
     copied over.
-
+
+``deprecated``
+    This is metadata for removing or renaming an input field from a spec.::
+
+        class RealignInputSpec(BaseInterfaceInputSpec):
+            jobtype = traits.Enum('estwrite', 'estimate', 'write',
+                                  deprecated='0.8',
+                                  desc='one of: estimate, write, estwrite',
+                                  usedefault=True)
+
+    In the above example this means that the `jobtype` input is deprecated and
+    will be removed in version 0.8. Deprecation should be set to two versions
+    from current release. Raises `TraitError` after package version crosses the
+    deprecation version.
+
+``new_name``
+    For inputs that are being renamed, one can specify the new name of the field.::
+
+        class RealignInputSpec(BaseInterfaceInputSpec):
+            jobtype = traits.Enum('estwrite', 'estimate', 'write',
+                                  deprecated='0.8', new_name='job_type',
+                                  desc='one of: estimate, write, estwrite',
+                                  usedefault=True)
+            job_type = traits.Enum('estwrite', 'estimate', 'write',
+                                   desc='one of: estimate, write, estwrite',
+                                   usedefault=True)
+
+    In the above example, the `jobtype` field is being renamed to `job_type`.
+    When `new_name` is provided it must exist as a trait, otherwise an exception
+    will be raised.
+
 CommandLine
 ^^^^^^^^^^^
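
A minimal sketch of the trait-deprecation metadata documented above (not part of the commit): the spec class and the `thresh`/`threshold` field names are hypothetical, only the `deprecated`/`new_name` metadata keys come from the docs. Setting the old field should emit a deprecation warning and forward the value to the new one, and raise `TraitError` once the package version passes the deprecation version.

    from nipype.interfaces.base import BaseInterfaceInputSpec, traits

    class MyToolInputSpec(BaseInterfaceInputSpec):
        # old spelling: deprecated as of 0.8, value is forwarded to 'threshold'
        thresh = traits.Float(deprecated='0.8', new_name='threshold',
                              desc='threshold value (deprecated spelling)')
        # the trait that new_name points to; it must exist as a trait,
        # otherwise an exception is raised when the deprecated field is used
        threshold = traits.Float(desc='threshold value')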

doc/index.rst

Lines changed: 0 additions & 7 deletions
@@ -1,10 +1,3 @@
-.. admonition:: Announcement
-
-  Nipype Connectivity Workshop 2012 in Magdeburg, Germany: Sep 8-9, 2012:
-  `Information and Registration`__
-
-  __ http://nipype.blogspot.com
-
 .. list-table::

  * - .. image:: images/nipype_architecture_overview2.png

doc/users/plugins.rst

Lines changed: 23 additions & 6 deletions
@@ -9,8 +9,8 @@ available plugins allow local and distributed execution of workflows and
 debugging. Each available plugin is described below.

 Current plugins are available for Linear, Multiprocessing, IPython_ distributed
-processing platforms and for direct processing on SGE_, PBS_, and Condor_. We
-anticipate future plugins for the Soma_ workflow and LSF_.
+processing platforms and for direct processing on SGE_, PBS_, Condor_, and LSF_. We
+anticipate future plugins for the Soma_ workflow.

 .. note::

@@ -34,7 +34,7 @@ Optional arguments::
 .. note::

    Except for the status_callback, the remaining arguments only apply to the
-   distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor
+   distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor/LSF

 For example:

@@ -71,11 +71,17 @@ a local system.

 Optional arguments::

-      n_procs : Number of processes to launch in parallel
+      n_procs : Number of processes to launch in parallel, if not set number of
+                processors/threads will be automatically detected

 To distribute processing on a multicore machine, simply call::

-      workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2})
+      workflow.run(plugin='MultiProc')
+
+This will use all available CPUs. If on the other hand you would like to restrict
+the number of used resources (to say 2 CPUs), you can call::
+
+      workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2})

 IPython
 -------

@@ -102,7 +108,7 @@ SGE/PBS
 In order to use nipype with SGE_ or PBS_ you simply need to call::

     workflow.run(plugin='SGE')
-    workflow.run(plugin='PBS)
+    workflow.run(plugin='PBS')

 Optional arguments::

@@ -130,6 +136,17 @@ particular node might use more resources than other nodes in a workflow.

     node.plugin_args = {'qsub_args': '-l nodes=1:ppn=3', 'overwrite': True}

+LSF
+---
+
+Submitting via LSF is almost identical to SGE above:
+
+    workflow.run(plugin='LSF')
+
+Optional arguments::
+
+    template: custom template file to use
+    bsub_args: any other command line args to be passed to bsub.

 Condor
 ------
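
Putting the plugin documentation above together, a hedged end-to-end sketch: the toy workflow with a single Function node is invented for illustration, the '-q short' LSF queue is a placeholder, and only the plugin names and plugin_args keys come from the docs.

    import nipype.pipeline.engine as pe
    import nipype.interfaces.utility as util

    def double(x):
        return 2 * x

    # a single-node workflow, just to have something to run
    node = pe.Node(util.Function(input_names=['x'], output_names=['out'],
                                 function=double), name='double')
    node.inputs.x = 21
    wf = pe.Workflow(name='plugin_demo')
    wf.add_nodes([node])

    wf.run(plugin='MultiProc')  # uses all detected CPUs/threads
    # wf.run(plugin='MultiProc', plugin_args={'n_procs': 2})       # cap at 2 processes
    # wf.run(plugin='LSF', plugin_args={'bsub_args': '-q short'})  # submit jobs via bsub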

examples/dmri_connectivity_advanced.py

Lines changed: 13 additions & 2 deletions
@@ -56,12 +56,13 @@
 import nipype.interfaces.cmtk as cmtk
 import nipype.interfaces.dipy as dipy
 import inspect
-import os.path as op # system functions
+import os, os.path as op # system functions
 from nipype.workflows.dmri.fsl.dti import create_eddy_correct_pipeline
 from nipype.workflows.dmri.camino.connectivity_mapping import select_aparc_annot
 from nipype.utils.misc import package_check
 import warnings
 from nipype.workflows.dmri.connectivity.nx import create_networkx_pipeline, create_cmats_to_csv_pipeline
+from nipype.workflows.smri.freesurfer import create_tessellation_flow

 try:
     package_check('cmp')

@@ -82,6 +83,9 @@
 fs.FSCommand.set_default_subjects_dir(subjects_dir)
 fsl.FSLCommand.set_default_output_type('NIFTI')

+fs_dir = os.environ['FREESURFER_HOME']
+lookup_file = op.join(fs_dir,'FreeSurferColorLUT.txt')
+
 """
 This needs to point to the fdt folder you can find after extracting

@@ -328,7 +332,7 @@

 CFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="CFFConverter")
 CFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))
-giftiSurfaces = pe.Node(interface=util.Merge(8), name="GiftiSurfaces")
+giftiSurfaces = pe.Node(interface=util.Merge(9), name="GiftiSurfaces")
 giftiLabels = pe.Node(interface=util.Merge(2), name="GiftiLabels")
 niftiVolumes = pe.Node(interface=util.Merge(3), name="NiftiVolumes")
 fiberDataArrays = pe.Node(interface=util.Merge(4), name="FiberDataArrays")

@@ -344,6 +348,9 @@
 NxStatsCFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="NxStatsCFFConverter")
 NxStatsCFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))

+tessflow = create_tessellation_flow(name='tessflow', out_format='gii')
+tessflow.inputs.inputspec.lookup_file = lookup_file
+
 """
 Connecting the workflow
 =======================

@@ -371,6 +378,9 @@
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subject_id","subject_id")])])

+mapping.connect([(inputnode, tessflow,[("subjects_dir","inputspec.subjects_dir")])])
+mapping.connect([(inputnode, tessflow,[("subject_id","inputspec.subject_id")])])
+
 mapping.connect([(inputnode, parcellate,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, parcellate,[("subject_id","subject_id")])])
 mapping.connect([(parcellate, mri_convert_ROI_scale500,[('roi_file','in_file')])])

@@ -516,6 +526,7 @@
 mapping.connect([(mris_convertRHinflated, giftiSurfaces,[("converted","in6")])])
 mapping.connect([(mris_convertLHsphere, giftiSurfaces,[("converted","in7")])])
 mapping.connect([(mris_convertRHsphere, giftiSurfaces,[("converted","in8")])])
+mapping.connect([(tessflow, giftiSurfaces,[("outputspec.meshes","in9")])])

 mapping.connect([(mris_convertLHlabels, giftiLabels,[("converted","in1")])])
 mapping.connect([(mris_convertRHlabels, giftiLabels,[("converted","in2")])])
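
The example script above folds create_tessellation_flow into the connectivity pipeline; as a standalone sketch of just that sub-workflow (subject ID and subjects directory are placeholders, and FREESURFER_HOME is assumed to be set, as in the script):

    import os
    import os.path as op
    from nipype.workflows.smri.freesurfer import create_tessellation_flow

    fs_dir = os.environ['FREESURFER_HOME']

    tessflow = create_tessellation_flow(name='tessflow', out_format='gii')
    tessflow.inputs.inputspec.subjects_dir = '/data/subjects'  # placeholder
    tessflow.inputs.inputspec.subject_id = 'subj01'            # placeholder
    tessflow.inputs.inputspec.lookup_file = op.join(fs_dir, 'FreeSurferColorLUT.txt')
    # tessflow.run()  # tessellated meshes come out of outputspec.meshes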

examples/dmri_group_connectivity_mrtrix.py

Lines changed: 0 additions & 6 deletions
@@ -140,12 +140,6 @@

 l1pipeline = create_group_connectivity_pipeline(group_list, group_id, data_dir, subjects_dir, output_dir, info)

-# This is used to demonstrate the ease through which different parameters can be set for each group.
-if group_id == 'parkinsons':
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.5
-else:
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.7
-
 # Here with invert the b-vectors in the Y direction and set the maximum harmonic order of the
 # spherical deconvolution step
 l1pipeline.inputs.connectivity.mapping.fsl2mrtrix.invert_y = True
