
Commit 425c978

Updated BIAPs

1 parent 288f777 commit 425c978

File tree

9 files changed: +105 -55 lines changed

doc/source/devel/biaps/biap_0000.rst

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-.. _BIAP0:
+.. _biap0:
 
 ============================
 BIAP 0 — Purpose and Process

doc/source/devel/biaps/biap_0001.rst

Lines changed: 1 addition & 0 deletions
@@ -5,6 +5,7 @@ BIAP1 - towards immutable images
 :Author: Matthew Brett
 :Status: Rejected
 :Type: Standards
+:Created: 2011-03-23
 
 **********
 Resolution

doc/source/devel/biaps/biap_0002.rst

Lines changed: 41 additions & 24 deletions
@@ -2,11 +2,16 @@
 BIAP2 : Slicecopy
 #################
 
+:Author: Matthew Brett
+:Status: Rejected
+:Type: Standards
+:Created: 2011-03-26
+
 ******
 Status
 ******
 
-Implemented as of Nibabel 2.0 with image proxy slicing : see
+Alternative implementation as of Nibabel 2.0 with image proxy slicing : see
 http://nipy.org/nibabel/images_and_memory.html#saving-time-and-memory
 
 **********
@@ -30,8 +35,8 @@ Questions
 Should ``slice0`` be a copy or a view?
 --------------------------------------
 
-As from the previous discussion - [[BIAP1]] - an image may be a proxy or an
-array.
+As from the previous discussion - :doc:`biap_0001` - an image may be a proxy
+or an array.
 
 If the image is an array, the most natural thing to return is a view. That is,
 modifying ``slice0`` will modify the underlying array in ``img``.
@@ -46,18 +51,18 @@ What slices should the slicing allow?
 
 The ``img.get_slice(0)`` syntax needs us to know what slice 0 is. In a nifti
 image of 3 dimensions, the first is fastest changing on disk. To be useful
-``0`` will probably refer to the slowest changing on disk. Otherwise we'll have
-to load nearly the whole image anyway. So, for a nifti, 0 should be the first
-slice in the last dimension.
+``0`` will probably refer to the slowest changing on disk. Otherwise we'll
+have to load nearly the whole image anyway. So, for a nifti, 0 should be the
+first slice in the last dimension.
 
-For Minc on the other hand, you can and I (MB) think always do get C ordered arrays
-back, so that the slowest changing dimension in the image array is the first.
-Actually, I don't know how to read a minc file slice by slice, but the general
-point is that, to know which slice is worth reading, you need to know the
-relationship of the image array dimensions to fastest / slowest on disk.
+For Minc on the other hand, you can and I (MB) think always do get C ordered
+arrays back, so that the slowest changing dimension in the image array is the
+first. Actually, I don't know how to read a Minc file slice by slice, but the
+general point is that, to know which slice is worth reading, you need to know
+the relationship of the image array dimensions to fastest / slowest on disk.
 
 We could always solve this by assuming that we always want to do this for
-analyze / nifti1 files (Fortran ordered). It's a little ugly of course.
+Analyze / Nifti1 files (Fortran ordered). It's a little ugly of course.
 
 Note that taking the slowest changing slice in a nifti image would be the
 equivalent of taking a slice from the last dimension::
@@ -84,14 +89,18 @@ arrays.
 Option 1: fancy slice object
 ============================
 
-It's option 1 because it's the first one I thought of::
+It's option 1 because it's the first one I thought of:
+
+.. code:: python
 
     slice0 = img.slicecopy[...,0]
 
 Here we solve the copy or view problem with 'always copy'. We solve the 'what
 slicing to allow' by letting the object decide how to do the slicing. We could
 obviously just do the full load (deproxy the image) and return a copy of the
-sliced array, as in::
+sliced array, as in:
+
+.. code:: python
 
     class SomeImage(object):
         class Slicer(object):
@@ -106,38 +115,46 @@ sliced array, as in::
         def __init__(self, stuff):
             self.slicecopy = Slicer(self)
 
-The problem with this is that::
+The problem with this is that:
+
+.. code:: python
 
     slice0 = img.slicecopy[...,1]
 
-might unproxy the image. At the moment, it's rather hidden whether the image is
-proxied or not on the basis that it's an optimization that should be
+might unproxy the image. At the moment, it's rather hidden whether the image
+is proxied or not on the basis that it's an optimization that should be
 transparent.
 
 Option 2: not-fancy method call
 ===============================
 
-::
+.. code:: python
 
     slice0 = img.get_slice(0, copy=True)
 
 'slice or view' solved with explicit keyword. 'which slice' solved by assuming
-you always mean one slice in the last dimension. Or we could also allow::
+you always mean one slice in the last dimension. Or we could also allow:
+
+.. code:: python
 
     slices = img.get_slices(slice(0,3), copy=True)
 
-This is ugly, but fairly clear. This simple 'I mean the last dimension' might be
-annoying because it assumes the last dimension is the slowest changing, and it
-does not get to optimize the more complex contiguous cases above. So we could
-even allow full slicing with stuff like::
+This is ugly, but fairly clear. This simple 'I mean the last dimension' might
+be annoying because it assumes the last dimension is the slowest changing, and
+it does not get to optimize the more complex contiguous cases above. So we
+could even allow full slicing with stuff like:
+
+.. code:: python
 
     slice = img.get_slices((slice(None), slice(None), slice(3)), copy=True)
 
 Again - this looks a lot more ugly than the ``slicecopy`` syntax above.
 
 Now, when would you choose ``copy=True``? I think, when the image is a proxy.
 Otherwise you'd want a view. Probably. So what you mean, probably, is
-something like this::
+something like this:
+
+.. code:: python
 
     slices = img.get_slices(slicedef, copy_if='is_proxy')
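A minimal sketch of the 'always copy' behaviour that Option 1 above describes, assuming a nibabel-style image with a ``dataobj`` attribute. The class name is hypothetical and this is not the API the commit documents; nibabel 2.0 instead shipped the proxy slicing noted in the Status line.

.. code:: python

    import numpy as np

    class SliceCopier:
        """Hypothetical helper mirroring the ``img.slicecopy[...]`` syntax.

        Indexing always returns a copied array, never a view, whether the
        image holds an in-memory array or an on-disk proxy.
        """

        def __init__(self, img):
            self._img = img

        def __getitem__(self, slicedef):
            # Load (de-proxy) the data, then slice and copy.
            data = np.asanyarray(self._img.dataobj)
            return data[slicedef].copy()

    # Usage corresponding to ``img.slicecopy[..., 0]`` in the proposal:
    # slice0 = SliceCopier(img)[..., 0]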

doc/source/devel/biaps/biap_0003.rst

Lines changed: 10 additions & 4 deletions
@@ -2,11 +2,17 @@
 BIAP3 - a JSON nifti header extension
 #####################################
 
-******
-Status
-******
+:Author: Matthew Brett, Bob Dougherty
+:Status: Draft
+:Type: Standards
+:Created: 2011-03-26
 
-Ongoing
+The following Wiki documents should be merged with this one:
+
+* `JSON header extension
+  <https://github.com/nipy/nibabel/wiki/json-header-extension>`_.
+* `NIfTI metadata extension
+  <https://github.com/nipy/nibabel/wiki/NIfTI-metadata-extension>`_
 
 **********
 Background

doc/source/devel/biaps/biap_0004.rst

Lines changed: 10 additions & 4 deletions
@@ -2,6 +2,11 @@
 BIAP4 - merging nibabel and dcmstack
 ####################################
 
+:Author: Brendan Moloney, Matthew Brett
+:Status: Draft
+:Type: Standards
+:Created: 2012-11-21
+
 In which we set out what dcmstack_ does and how it might integrate with the
 nibabel objects and functions.
 
@@ -14,10 +19,11 @@ Nifti, before doing any image processing. The Nifti format is significantly
 easier to work with and has wide spread compatibility. However, the vast
 amount of meta data stored in the source DICOM files will be lost.
 
-After implementing this proposal, users will be able to preserve all of the meta
-data from the DICOM files during conversion, including meta data from private
-elements. The meta data will then be easily accessible through the `SpatialImage`
-API::
+After implementing this proposal, users will be able to preserve all of the
+meta data from the DICOM files during conversion, including meta data from
+private elements. The meta data will then be easily accessible through the
+`SpatialImage` API::
+
     >>> nii = nb.load('input.nii')
     >>> data = nii.get_data()
     >>> print data.shape
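As a rough illustration of where converted metadata can live today, a minimal sketch assuming a NIfTI-1 file (the file name is hypothetical): nibabel already exposes NIfTI-1 header extensions, which is the kind of container a dcmstack-derived metadata blob could occupy.

.. code:: python

    import nibabel as nib

    # Inspect any header extensions attached to a converted NIfTI-1 image.
    img = nib.load('converted_from_dicom.nii')
    for ext in img.header.extensions:
        print(ext.get_code(), len(ext.get_content()))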

doc/source/devel/biaps/biap_0005.rst

Lines changed: 5 additions & 0 deletions
@@ -2,6 +2,11 @@
 BIAP5 - a streamlines converter
 ###############################
 
+:Author: Marc-Alexandre Côté
+:Status: Draft
+:Type: Standards
+:Created: 2013-09-03
+
 The first objective of this proposal is to add support to other streamlines
 format. The second objective is to be able to easily convert from one file
 format to another.

doc/source/devel/biaps/biap_0006.rst

Lines changed: 16 additions & 9 deletions
@@ -2,11 +2,10 @@
 BIAP6 - identifying image axes
 ##############################
 
-******
-Status
-******
-
-Ongoing
+:Author: Matthew Brett
+:Status: Draft
+:Type: Standards
+:Created: 2015-07-11
 
 **********
 Background
@@ -16,13 +15,17 @@ Image axes can have meaningful labels.
 
 For example in a typical 4D NIfTI file, as we move along the 4th dimension in
 the image array, we are also moving in time. For example, this would be the
-first volume (in time)::
+first volume (in time):
+
+.. code:: python
 
     img = nibabel.load('my_4d.nii')
     data = img.get_data()
    vol0 = data[..., 0]
 
-and this would be second volume in time::
+and this would be second volume in time:
+
+.. code:: python
 
    vol1 = data[..., 1]
 
@@ -36,7 +39,9 @@ dimension.
 
 It is common to acquire MRI images one slice at a time. In a 3D or 4D NIfTI,
 the 3rd axis often contains these slices. So this these would be the first
-and second slices of data collected by the scanner::
+and second slices of data collected by the scanner:
+
+.. code:: python
 
    slice0 = vol0[:, :, 0]
    slice1 = vol0[:, :, 1]
@@ -84,7 +89,9 @@ length > 1. Quoting from the standard::
 
 This arrangement happens in practice. For example, SPM deformation fields
 have three values for each voxel (x, y, z displacement), and have shape (I, J,
-K, 1, 3)::
+K, 1, 3):
+
+.. code:: python
 
    In [7]: img = nib.load('y_highres001.nii')
    In [8]: img.shape
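A minimal sketch of the axis conventions discussed in this file, assuming the current nibabel loading API (file names hypothetical): the last array axis of a 4D NIfTI indexes time, the third axis often indexes acquired slices, and the SPM deformation-field example carries a length-1 fourth axis that can be squeezed away.

.. code:: python

    import numpy as np
    import nibabel as nib

    img = nib.load('my_4d.nii')
    data = img.get_fdata()
    vol0 = data[..., 0]          # first volume in time
    slice0 = vol0[:, :, 0]       # first acquired slice of that volume

    deform = nib.load('y_highres001.nii')
    print(deform.shape)          # e.g. (I, J, K, 1, 3)
    # Drop the length-1 fourth axis to get an (I, J, K, 3) displacement array.
    displacements = np.squeeze(np.asanyarray(deform.dataobj), axis=3)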

doc/source/devel/biaps/biap_0007.rst

Lines changed: 5 additions & 6 deletions
@@ -2,11 +2,10 @@
 BIAP7 - loading multiple images
 ###############################
 
-******
-Status
-******
-
-In discussion
+:Author: Matthew Brett
+:Status: Draft
+:Type: Standards
+:Created: 2015-07-18
 
 **********
 Background
@@ -112,4 +111,4 @@ Next steps:
 * Make sure there are use-cases where you could wish to call `load` vs. `load_multi` on the same image (perhaps a Nifti image with different affines for each volume)
 * Investigate AFNI file formats as a use-case for this.
 * Check the `nilearn` codebase, see if `iter_img` and `slice_img` functions might offer a post-`load` alternative. Also check if those functions could be deprecated in favor of slicing / iterating on `dataobj`
-* Create a new issue to implement getting an iterator on `dataobj`?
+* Create a new issue to implement getting an iterator on `dataobj`?
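On the last bullet of the "Next steps" list, a minimal sketch (file name hypothetical) of what iterating on ``dataobj`` could look like with the proxy slicing nibabel already provides, reading one volume at a time instead of the full 4D array.

.. code:: python

    import nibabel as nib

    img = nib.load('my_4d.nii')
    for t in range(img.shape[-1]):
        vol = img.dataobj[..., t]   # proxied images read only this volume
        print(t, vol.mean())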

doc/source/devel/biaps/biap_0008.rst

Lines changed: 16 additions & 7 deletions
@@ -2,9 +2,10 @@
 BIAP8 - always load image data as floating point
 ################################################
 
-******
-Status
-******
+:Author: Matthew Brett
+:Status: Accepted
+:Type: Standards
+:Created: 2018-04-18
 
 ``get_fdata`` shipped as of nibabel 2.2.0.
 
@@ -29,7 +30,9 @@ the image data after loading into memory.
 In detail
 =========
 
-At the moment, if you do this::
+At the moment, if you do this:
+
+.. code:: python
 
     img = nib.load('my_image.nii')
     data = img.get_data()
@@ -43,7 +46,9 @@ data type - here ``np.int16``.
 This is very efficient in terms of memory, but it can be a real trap unless
 you are careful.
 
-For example, let's say you had a pipeline where you did this::
+For example, let's say you had a pipeline where you did this:
+
+.. code:: python
 
     sum = img.get_data().sum()
 
@@ -57,7 +62,9 @@ image arrays to floating point to get sensible answers.
 Current implementation
 ======================
 
-``get_data`` has the following implementation, at time of writing::
+``get_data`` has the following implementation, at time of writing:
+
+.. code:: python
 
     def get_data(self):
         """ Return image data from image with any necessary scalng applied
@@ -90,7 +97,9 @@ The future default behavior of nibabel should be to do the thing least likely
 to trip you up by accident. But - we do not want the result of ``get_data``
 to change silently between nibabel versions.
 
-* step 1: now - add ``get_fdata`` method::
+* step 1: now - add ``get_fdata`` method:
+
+.. code:: python
 
     def get_fdata(self, dtype=np.float64):
         """ Return floating point image data with necessary scalng applied.
