Commit 5016d3e

Merge branch 'master' into enh/versioning
* master: (106 commits)
  Doctest fix, better error reporting.
  fix: config updated before checking for deprecated options
  put back the n_procs docs
  Udated docs with n_procs for MultiProc
  DOC: Improve docstring for DataFinder, include example usage.
  fix: adding atlases
  fix: removing recommends
  fix: restore fsl
  fix: remove fsl for now
  fix: variable name
  fix: use neurodebian sources
  fix: add missing nibabel
  fix: install nibabel in python dir
  fix: remove easy_install suffix
  fix: testing sklearn travis fix
  BF: Fixes for last commit
  BF: Use ignore regexes instead of functions so they can pickle.
  ENH: Added DataFinder interface.
  added cmdline example
  pep8
  ...
2 parents 4c745fa + 03e64b8 commit 5016d3e


74 files changed (+3912, -2326 lines)

.travis.yml

Lines changed: 21 additions & 22 deletions

@@ -1,22 +1,21 @@
-# vim ft=yaml
-# travis-ci.org definition for nipy build
-#
-# We pretend to be erlang because we need can't use the python support in
-# travis-ci; it uses virtualenvs, they do not have numpy, scipy, matplotlib,
-# and it is impractical to build them
-language: erlang
-env:
-- PYTHON=python PYSUF=''
-# - PYTHON=python3 PYSUF=3 : python3-numpy not currently available
-install:
-- sudo apt-get install $PYTHON-dev
-- sudo apt-get install $PYTHON-numpy
-- sudo apt-get install $PYTHON-scipy
-- sudo apt-get install $PYTHON-networkx
-- sudo apt-get install $PYTHON-traits
-- sudo apt-get install $PYTHON-setuptools
-- sudo easy_install$PYSUF nibabel # Latest pypi
-- sudo apt-get install $PYTHON-nose
-script:
-# Change into an innocuous directory and find tests from installation
-- make test
+language: python
+python:
+- "2.7"
+before_install:
+- deactivate
+- sudo apt-get update -qq
+- sudo apt-get install lsb-release
+- source /etc/lsb-release
+- wget -O- http://neuro.debian.net/lists/${DISTRIB_CODENAME}.us-nh | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list
+- sudo apt-key adv --recv-keys --keyserver pgp.mit.edu 2649A5A9
+- sudo apt-get update -qq
+- sudo apt-get install -qq python-scipy python-nose
+- sudo apt-get install -qq python-networkx python-traits python-setuptools
+- sudo apt-get install -qq python-nibabel
+- sudo apt-get install -qq --no-install-recommends fsl afni
+- sudo apt-get install -qq fsl-atlases
+- source /etc/fsl/fsl.sh
+- virtualenv --system-site-packages ~/virtualenv/this
+- source ~/virtualenv/this/bin/activate
+install: python setup.py build_ext --inplace
+script: make test

CHANGES

Lines changed: 3 additions & 1 deletion

@@ -1,13 +1,15 @@
 Next release
 ============
 
-* ENH: New interfaces: ICC, Meshfix, ants.Register
+* ENH: Add basic support for LSF plugin.
+* ENH: New interfaces: ICC, Meshfix, ants.Register, C3dAffineTool, ants.JacobianDeterminant
 * ENH: New workflows: ants template building (both using 'ANTS' and the new 'antsRegistration')
 * ENH: New examples: how to use ANTS template building workflows (smri_ants_build_tmeplate),
   how to set SGE specific options (smri_ants_build_template_new)
 * ENH: added no_flatten option to Merge
 * ENH: added versioning option and checking to traits
 * ENH: added deprecation metadata to traits
+* ENH: Slicer interfaces were updated to version 4.1
 
 Release 0.6.0 (Jun 30, 2012)
 ============================

doc/users/plugins.rst

Lines changed: 23 additions & 6 deletions

@@ -9,8 +9,8 @@ available plugins allow local and distributed execution of workflows and
 debugging. Each available plugin is described below.
 
 Current plugins are available for Linear, Multiprocessing, IPython_ distributed
-processing platforms and for direct processing on SGE_, PBS_, and Condor_. We
-anticipate future plugins for the Soma_ workflow and LSF_.
+processing platforms and for direct processing on SGE_, PBS_, Condor_, and LSF_. We
+anticipate future plugins for the Soma_ workflow.
 
 .. note::
 
@@ -34,7 +34,7 @@ Optional arguments::
 .. note::
 
    Except for the status_callback, the remaining arguments only apply to the
-   distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor
+   distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor/LSF
 
 For example:
 
@@ -71,11 +71,17 @@ a local system.
 
 Optional arguments::
 
-    n_procs : Number of processes to launch in parallel
+    n_procs : Number of processes to launch in parallel, if not set number of
+              processors/threads will be automatically detected
 
 To distribute processing on a multicore machine, simply call::
 
-    workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2})
+    workflow.run(plugin='MultiProc')
+
+This will use all available CPUs. If on the other hand you would like to restrict
+the number of used resources (to say 2 CPUs), you can call::
+
+    workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2}
 
 IPython
 -------
@@ -102,7 +108,7 @@ SGE/PBS
 In order to use nipype with SGE_ or PBS_ you simply need to call::
 
     workflow.run(plugin='SGE')
-    workflow.run(plugin='PBS)
+    workflow.run(plugin='PBS')
 
 Optional arguments::
 
@@ -130,6 +136,17 @@ particular node might use more resources than other nodes in a workflow.
 
     node.plugin_args = {'qsub_args': '-l nodes=1:ppn=3', 'overwrite': True}
 
+LSF
+---
+
+Submitting via LSF is almost identical to SGE above:
+
+    workflow.run(plugin='LSF')
+
+Optional arguments::
+
+    template: custom template file to use
+    bsub_args: any other command line args to be passed to bsub.
 
 Condor
 ------
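The MultiProc change above lets `n_procs` default to the detected CPU count when it is not given. A minimal sketch of that fallback logic (the `resolve_n_procs` helper is hypothetical, not nipype's actual code):

```python
import multiprocessing

def resolve_n_procs(plugin_args=None):
    """Honor an explicit 'n_procs' entry; otherwise fall back to
    the number of CPUs detected on this machine."""
    plugin_args = plugin_args or {}
    if 'n_procs' in plugin_args:
        return plugin_args['n_procs']
    return multiprocessing.cpu_count()

print(resolve_n_procs({'n_procs': 2}))  # → 2
print(resolve_n_procs())                # machine-dependent, always >= 1
```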

examples/dmri_connectivity_advanced.py

Lines changed: 13 additions & 2 deletions

@@ -56,12 +56,13 @@
 import nipype.interfaces.cmtk as cmtk
 import nipype.interfaces.dipy as dipy
 import inspect
-import os.path as op # system functions
+import os, os.path as op # system functions
 from nipype.workflows.dmri.fsl.dti import create_eddy_correct_pipeline
 from nipype.workflows.dmri.camino.connectivity_mapping import select_aparc_annot
 from nipype.utils.misc import package_check
 import warnings
 from nipype.workflows.dmri.connectivity.nx import create_networkx_pipeline, create_cmats_to_csv_pipeline
+from nipype.workflows.smri.freesurfer import create_tessellation_flow
 
 try:
     package_check('cmp')
@@ -82,6 +83,9 @@
 fs.FSCommand.set_default_subjects_dir(subjects_dir)
 fsl.FSLCommand.set_default_output_type('NIFTI')
 
+fs_dir = os.environ['FREESURFER_HOME']
+lookup_file = op.join(fs_dir,'FreeSurferColorLUT.txt')
+
 """
 This needs to point to the fdt folder you can find after extracting
@@ -328,7 +332,7 @@
 
 CFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="CFFConverter")
 CFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))
-giftiSurfaces = pe.Node(interface=util.Merge(8), name="GiftiSurfaces")
+giftiSurfaces = pe.Node(interface=util.Merge(9), name="GiftiSurfaces")
 giftiLabels = pe.Node(interface=util.Merge(2), name="GiftiLabels")
 niftiVolumes = pe.Node(interface=util.Merge(3), name="NiftiVolumes")
 fiberDataArrays = pe.Node(interface=util.Merge(4), name="FiberDataArrays")
@@ -344,6 +348,9 @@
 NxStatsCFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="NxStatsCFFConverter")
 NxStatsCFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))
 
+tessflow = create_tessellation_flow(name='tessflow', out_format='gii')
+tessflow.inputs.inputspec.lookup_file = lookup_file
+
 """
 Connecting the workflow
 =======================
@@ -371,6 +378,9 @@
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subject_id","subject_id")])])
 
+mapping.connect([(inputnode, tessflow,[("subjects_dir","inputspec.subjects_dir")])])
+mapping.connect([(inputnode, tessflow,[("subject_id","inputspec.subject_id")])])
+
 mapping.connect([(inputnode, parcellate,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, parcellate,[("subject_id","subject_id")])])
 mapping.connect([(parcellate, mri_convert_ROI_scale500,[('roi_file','in_file')])])
@@ -516,6 +526,7 @@
 mapping.connect([(mris_convertRHinflated, giftiSurfaces,[("converted","in6")])])
 mapping.connect([(mris_convertLHsphere, giftiSurfaces,[("converted","in7")])])
 mapping.connect([(mris_convertRHsphere, giftiSurfaces,[("converted","in8")])])
+mapping.connect([(tessflow, giftiSurfaces,[("outputspec.meshes","in9")])])
 
 mapping.connect([(mris_convertLHlabels, giftiLabels,[("converted","in1")])])
 mapping.connect([(mris_convertRHlabels, giftiLabels,[("converted","in2")])])
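The `Merge(8)` → `Merge(9)` bump above makes room for the tessellation meshes as a ninth surface input. `util.Merge(N)` simply gathers its `in1` … `inN` inputs into a single list output; a plain-Python sketch of that contract (not nipype's implementation):

```python
def merge_inputs(**inputs):
    """Collect in1..inN keyword arguments into one ordered list,
    mimicking the contract of nipype's util.Merge(N) node."""
    return [inputs['in%d' % i] for i in range(1, len(inputs) + 1)]

print(merge_inputs(in1='lh.pial.gii', in2='rh.pial.gii', in3='meshes.gii'))
# → ['lh.pial.gii', 'rh.pial.gii', 'meshes.gii']
```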

examples/dmri_group_connectivity_mrtrix.py

Lines changed: 0 additions & 6 deletions

@@ -140,12 +140,6 @@
 
 l1pipeline = create_group_connectivity_pipeline(group_list, group_id, data_dir, subjects_dir, output_dir, info)
 
-# This is used to demonstrate the ease through which different parameters can be set for each group.
-if group_id == 'parkinsons':
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.5
-else:
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.7
-
 # Here with invert the b-vectors in the Y direction and set the maximum harmonic order of the
 # spherical deconvolution step
 l1pipeline.inputs.connectivity.mapping.fsl2mrtrix.invert_y = True

nipype/algorithms/misc.py

Lines changed: 42 additions & 31 deletions

@@ -649,32 +649,33 @@ def _list_outputs(self):
         return outputs
 
 def merge_csvs(in_list):
-        for idx, in_file in enumerate(in_list):
-            try:
-                in_array = np.loadtxt(in_file, delimiter=',')
-            except ValueError, ex:
-                try:
-                    in_array = np.loadtxt(in_file, delimiter=',', skiprows=1)
-                except ValueError, ex:
-                    first = open(in_file, 'r')
-                    header_line = first.readline()
-                    header_list = header_line.split(',')
-                    n_cols = len(header_list)
-                    try:
-                        in_array = np.loadtxt(in_file, delimiter=',', skiprows=1, usecols=range(1,n_cols))
-                    except ValueError, ex:
-                        in_array = np.loadtxt(in_file, delimiter=',', skiprows=1, usecols=range(1,n_cols-1))
-            if idx == 0:
-                out_array = in_array
-            else:
-                out_array = np.dstack((out_array, in_array))
-        out_array = np.squeeze(out_array)
-        iflogger.info('Final output array shape:')
-        iflogger.info(np.shape(out_array))
-        return out_array
+    for idx, in_file in enumerate(in_list):
+        try:
+            in_array = np.loadtxt(in_file, delimiter=',')
+        except ValueError, ex:
+            try:
+                in_array = np.loadtxt(in_file, delimiter=',', skiprows=1)
+            except ValueError, ex:
+                first = open(in_file, 'r')
+                header_line = first.readline()
+                header_list = header_line.split(',')
+                n_cols = len(header_list)
+                try:
+                    in_array = np.loadtxt(in_file, delimiter=',', skiprows=1, usecols=range(1,n_cols))
+                except ValueError, ex:
+                    in_array = np.loadtxt(in_file, delimiter=',', skiprows=1, usecols=range(1,n_cols-1))
+        if idx == 0:
+            out_array = in_array
+        else:
+            out_array = np.dstack((out_array, in_array))
+    out_array = np.squeeze(out_array)
+    iflogger.info('Final output array shape:')
+    iflogger.info(np.shape(out_array))
+    return out_array
 
 def remove_identical_paths(in_files):
     import os.path as op
+    from nipype.utils.filemanip import split_filename
     if len(in_files) > 1:
         out_names = list()
         commonprefix = op.commonprefix(in_files)
@@ -699,24 +700,27 @@ def maketypelist(rowheadings, shape, extraheadingBool, extraheading):
         for idx in range(1,(min(shape)+1)):
             typelist.append((str(idx), float))
     else:
-        typelist.append((str(1), float))
+        for idx in range(1,(shape[0]+1)):
+            typelist.append((str(idx), float))
     if extraheadingBool:
         typelist.append((extraheading, 'a40'))
     iflogger.info(typelist)
     return typelist
 
 def makefmtlist(output_array, typelist, rowheadingsBool, shape, extraheadingBool):
-    output = np.zeros(max(shape), typelist)
     fmtlist = []
     if rowheadingsBool:
         fmtlist.append('%s')
     if len(shape) > 1:
+        output = np.zeros(max(shape), typelist)
         for idx in range(1,min(shape)+1):
             output[str(idx)] = output_array[:,idx-1]
             fmtlist.append('%f')
     else:
-        output[str(1)] = output_array
-        fmtlist.append('%f')
+        output = np.zeros(1, typelist)
+        for idx in range(1,len(output_array)+1):
+            output[str(idx)] = output_array[idx-1]
+            fmtlist.append('%f')
     if extraheadingBool:
         fmtlist.append('%s')
     fmt = ','.join(fmtlist)
@@ -727,6 +731,7 @@ class MergeCSVFilesInputSpec(TraitedSpec):
     out_file = File('merged.csv', usedefault=True, desc='Output filename for merged CSV file')
     column_headings = traits.List(traits.Str, desc='List of column headings to save in merged CSV file (must be equal to number of input files). If left undefined, these will be pulled from the input filenames.')
     row_headings = traits.List(traits.Str, desc='List of row headings to save in merged CSV file (must be equal to number of rows in the input files).')
+    row_heading_title = traits.Str('label', usedefault=True, desc='Column heading for the row headings added')
     extra_column_heading = traits.Str(desc='New heading to add for the added field.')
     extra_field = traits.Str(desc='New field to add to each row. This is useful for saving the group or subject ID in the file.')
 
@@ -756,6 +761,7 @@ class MergeCSVFiles(BaseInterface):
 
     def _run_interface(self, runtime):
         extraheadingBool = False
+        extraheading = ''
         rowheadingsBool = False
         """
         This block defines the column headings.
@@ -775,14 +781,15 @@ def _run_interface(self, runtime):
                 extraheading = 'type'
                 iflogger.info('Extra column heading was not defined. Using "type"')
             headings.append(extraheading)
-                extraheadingBool = True
+            extraheadingBool = True
 
         if len(self.inputs.in_files) == 1:
             iflogger.warn('Only one file input!')
 
         if isdefined(self.inputs.row_headings):
             iflogger.info('Row headings have been provided. Adding "labels" column header.')
-            csv_headings = '"labels","' + '","'.join(itertools.chain(headings)) + '"\n'
+            prefix = '"{p}","'.format(p=self.inputs.row_heading_title)
+            csv_headings = prefix + '","'.join(itertools.chain(headings)) + '"\n'
             rowheadingsBool = True
         else:
             iflogger.info('Row headings have not been provided.')
@@ -814,12 +821,16 @@ def _run_interface(self, runtime):
             for row_heading in row_heading_list:
                 row_heading_with_quotes = '"' + row_heading + '"'
                 row_heading_list_with_quotes.append(row_heading_with_quotes)
-            row_headings = np.array(row_heading_list_with_quotes)
+            row_headings = np.array(row_heading_list_with_quotes, dtype='|S40')
             output['heading'] = row_headings
 
         if isdefined(self.inputs.extra_field):
             extrafieldlist = []
-            for idx in range(0,max(shape)):
+            if len(shape) > 1:
+                mx = shape[0]
+            else:
+                mx = 1
+            for idx in range(0,mx):
                 extrafieldlist.append(self.inputs.extra_field)
             iflogger.info(len(extrafieldlist))
             output[extraheading] = extrafieldlist
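The re-indented `merge_csvs` above loads each numeric CSV while tolerating header rows, then stacks the arrays. A stdlib-only sketch of the same parsing idea (the `load_numeric_csv` helper is hypothetical; the real function uses `np.loadtxt` and `np.dstack`):

```python
import csv
import io

def load_numeric_csv(text):
    """Parse CSV text into rows of floats, skipping any row (such as
    a header) whose cells do not all parse as numbers."""
    rows = []
    for row in csv.reader(io.StringIO(text)):
        try:
            rows.append([float(cell) for cell in row])
        except ValueError:  # header or label row: skip it
            continue
    return rows

print(load_numeric_csv("x,y\n1,2\n3,4\n"))  # → [[1.0, 2.0], [3.0, 4.0]]
```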

nipype/algorithms/modelgen.py

Lines changed: 1 addition & 1 deletion

@@ -278,7 +278,7 @@ def _generate_standard_design(self, infolist,
             sessinfo.insert(i, dict(cond=[]))
             if isdefined(self.inputs.high_pass_filter_cutoff):
                 sessinfo[i]['hpf'] = np.float(self.inputs.high_pass_filter_cutoff)
-            if hasattr(info, 'conditions') and info.conditions:
+            if hasattr(info, 'conditions') and info.conditions is not None:
                 for cid, cond in enumerate(info.conditions):
                     sessinfo[i]['cond'].insert(cid, dict())
                     sessinfo[i]['cond'][cid]['name'] = info.conditions[cid]
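The one-line modelgen fix above swaps a truthiness test for an explicit `None` check, so a defined-but-empty `conditions` list no longer skips the block (its `for` loop just runs zero times). The distinction in isolation:

```python
conditions = []  # defined, but empty

old_check = bool(conditions)        # False: an empty list is falsy
new_check = conditions is not None  # True: only None is excluded

print(old_check, new_check)  # → False True
```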

nipype/interfaces/ants/__init__.py

Lines changed: 1 addition & 1 deletion

@@ -14,4 +14,4 @@
 from .segmentation import Atropos, N4BiasFieldCorrection
 
 # Utility Programs
-from .utils import AverageAffineTransform, AverageImages, MultiplyImages
+from .utils import AverageAffineTransform, AverageImages, MultiplyImages, JacobianDeterminant
