
Commit 51cc6f4

Merge branch 'main' into maint/update-nitransforms-pin
2 parents bce795c + 359b03b commit 51cc6f4

File tree

18 files changed: +386 −186 lines changed


.github/workflows/test.yml

Lines changed: 1 addition & 1 deletion

@@ -115,7 +115,7 @@ jobs:
           # Override libiconv pre-installed from anaconda channel
           # See https://github.com/conda-forge/libitk-feedstock/issues/98
           # Since we're not creating a new environment, we must be explicit
-          conda install -c conda-forge ants=2.5 libiconv
+          conda install -c conda-forge ants=2.4 libitk=5.3 libiconv
       - name: Verify antsRegistration path
         run: |
           export PATH=$ANTSPATH:$PATH

AGENTS.md

Lines changed: 107 additions & 0 deletions

@@ -0,0 +1,107 @@
+<!--
+Copyright The NiPreps Developers <[email protected]>
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+We support and encourage derived works from this project, please read
+about our expectations at
+
+    https://www.nipreps.org/community/licensing/
+-->
+
+# AGENTS instructions
+
+The project's source code lives under `src/nifreeze/` and tests under `tests/`.
+
+## Testing
+
+### Pre-requisites
+
+- Bootstrap version metadata (which will create an `src/nifreeze/_version.py` file):
+  ```
+  python -m hatch version
+  ```
+
+- Some software needs to be installed prior to testing, for example ANTs:
+  ```
+  conda install -c conda-forge ants=2.4 libitk=5.3 libiconv
+  ```
+- Notebooks generate figures with LaTeX commands inside, therefore:
+  ```
+  sudo apt install texlive texlive-latex-extra texlive-fonts-recommended cm-super dvipng
+  ```
+- A number of tests use pre-existing data (stored in the git-annex-enabled GIN G-Node repository https://gin.g-node.org/nipreps-data/tests-nifreeze) that needs to be found at the location indicated by the environment variable `TEST_DATA_HOME`:
+  ```
+  uvx datalad-installer --sudo ok git-annex
+  uv tool install --with=datalad-osf --with=datalad-next datalad
+  uv tool install --with=datalad-next datalad-osf
+  datalad wtf  # check datalad is installed
+
+  # Install the dataset
+  if [[ ! -d "${TEST_DATA_HOME}" ]]; then
+      datalad install -rg --source=https://gin.g-node.org/nipreps-data/tests-nifreeze ${TEST_DATA_HOME}
+  else
+      cd ${TEST_DATA_HOME}
+      datalad update --merge -r .
+      datalad get -r -J4 *
+  fi
+  ```
+  Files in GIN's annex can be retrieved with curl by composing a URL like this one for the [`dmri_data/motion_test_data/dwi_motion.h5` file](https://gin.g-node.org/nipreps-data/tests-nifreeze/raw/master/dmri_data/motion_test_data/dwi_motion.h5).
+
+- Some test data comes from DIPY:
+  ```
+  echo "from dipy.data import fetch_stanford_hardi; fetch_stanford_hardi()" > fetch.py
+  uv tool install dipy
+  uv add --script fetch.py dipy
+  uv run fetch.py
+  ```
+
+Details about testing are found in `.github/workflows/test.yml`.
+
+### Unit tests
+
+- Unit tests can be executed with pytest: `pytest tests/`.
+- The project includes doctests, which can be run with `pytest --doctest-modules src/nifreeze`.
+
+### Integration tests and benchmarks
+
+- The full battery of tests can be run through tox (`tox -v`).
+- Install tox using `uv tool install --with=tox-uv --with=tox-gh-actions tox`.
+
+## Documentation building
+
+Documentation can be built as described in `.github/workflows/docs-build-pr.yml`.
+
+## Linting
+
+Before accepting new PRs, we use the latest version of Ruff to lint the code, as in `.github/workflows/contrib.yml`:
+
+- Check correctness:
+  ```
+  pipx run ruff check --fix <files>
+  ```
+- Reformat:
+  ```
+  pipx run ruff format <files>
+  ```
+
+## Codex instructions
+
+- Always plan first.
+- Think harder in the planning phase.
+- When proposing tasks, highlight potential critical points that could lead to side effects.
+
+## Commits and PRs
+
+- Commit messages should follow the semantic commit conventions and contain at least one line with the format `<type-code>: <message>`, where `<type-code>` indicates the type of change: fixes and bugfixes (`fix:`), enhancements and new features (`enh:`), style (`sty:`), documentation (`doc:`), maintenance (`mnt:`), etc.
+- PR titles should also be semantic, and use the same type codes but in all caps (e.g., `FIX:`, `ENH:`, `STY:`, `DOC:`, `MNT:`).
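The commit-message convention above can be tried out in a scratch repository; the repository setup and the message text below are hypothetical examples, not commands from the project's workflows:

```shell
# Create a throwaway repository and make an empty commit whose subject
# follows the `<type-code>: <message>` convention (message is hypothetical).
tmp="$(mktemp -d)"
cd "$tmp"
git init --quiet
git -c user.name="demo" -c user.email="demo@example.com" \
    commit --allow-empty --quiet -m "mnt: update dependency pins"
# Print the subject line to confirm the semantic prefix
git log -1 --pretty=%s
```

Per the PR-title rule, the matching pull-request title would use the same code in all caps, e.g. `MNT: update dependency pins`.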

docs/notebooks/data_structures.ipynb

Lines changed: 3 additions & 3 deletions

@@ -93,7 +93,7 @@
 }
 ],
 "source": [
-"plot_gradients(dwi.gradients.T);"
+"plot_gradients(dwi.gradients);"
 ]
 },
 {
@@ -192,7 +192,7 @@
 }
 ],
 "source": [
-"plot_gradients(dwi.gradients.T);"
+"plot_gradients(dwi.gradients);"
 ]
 },
 {
@@ -213,7 +213,7 @@
 ],
 "source": [
 "# Select a b-value\n",
-"b2000_gradientmask = dwi.gradients[-1, ...] == 2000\n",
+"b2000_gradientmask = dwi.gradients[:, -1] == 2000\n",
 "\n",
 "# Select b=2000\n",
 "data, _, grad = dwi[b2000_gradientmask]\n",
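The notebook edits above track a change in gradient-table orientation: b-values are now read from the last column (`dwi.gradients[:, -1]`) instead of the last row. A minimal NumPy sketch with hypothetical values illustrates the new row-per-volume layout:

```python
import numpy as np

# Hypothetical gradient table in the new layout: one row per
# diffusion-weighted volume, columns [R, A, S+, b].
gradients = np.array(
    [
        [0.0, 0.0, 0.0, 0.0],  # b=0 reference volume
        [1.0, 0.0, 0.0, 1000.0],
        [0.0, 1.0, 0.0, 2000.0],
        [0.0, 0.0, 1.0, 2000.0],
    ]
)

# Old (4, N) layout: b-values were the last *row*, gradients[-1, ...].
# New (N, 4) layout: b-values are the last *column*, gradients[:, -1].
b2000_mask = gradients[:, -1] == 2000

print(gradients[b2000_mask].shape)  # → (2, 4)
```

The boolean mask then selects whole rows (volumes), which is why the notebook can index the dataset directly with it.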

docs/notebooks/pet_motion_estimation.ipynb

Lines changed: 19 additions & 19 deletions

@@ -2428,10 +2428,11 @@
 ]
 },
 {
-"metadata": {},
 "cell_type": "code",
-"outputs": [],
 "execution_count": null,
+"id": "d3627d44376b27f4",
+"metadata": {},
+"outputs": [],
 "source": [
 "import numpy as np\n",
 "import pandas as pd\n",
@@ -2441,20 +2442,20 @@
 "# Assume `affines` is the list of affine matrices computed earlier\n",
 "motion_parameters = []\n",
 "\n",
-"for idx, affine in enumerate(affines):\n",
+"for _idx, affine in enumerate(affines):\n",
 "    tx, ty, tz, rx, ry, rz = extract_motion_parameters(affine)\n",
 "    motion_parameters.append([tx, ty, tz, rx, ry, rz])\n",
 "\n",
 "motion_parameters = np.array(motion_parameters)\n",
 "estimated_fd = compute_fd_from_motion(motion_parameters)"
-],
-"id": "d3627d44376b27f4"
+]
 },
 {
-"metadata": {},
 "cell_type": "code",
-"outputs": [],
 "execution_count": null,
+"id": "4f141ebdb1643673",
+"metadata": {},
+"outputs": [],
 "source": [
 "# Set up the matplotlib figure\n",
 "import matplotlib.pyplot as plt\n",
@@ -2466,20 +2467,20 @@
 "plot_volumewise_motion(np.arange(len(estimated_fd)), motion_parameters)\n",
 "\n",
 "plt.show()"
-],
-"id": "4f141ebdb1643673"
+]
 },
 {
-"metadata": {},
 "cell_type": "markdown",
-"source": "For the dataset used in this example, we have access to the ground truth motion parameters that were used to corrupt the motion-free dataset. Let's now plot the ground truth motion to enable a visual comparison with the estimated motion.",
-"id": "e3f45164598d16f0"
+"id": "e3f45164598d16f0",
+"metadata": {},
+"source": "For the dataset used in this example, we have access to the ground truth motion parameters that were used to corrupt the motion-free dataset. Let's now plot the ground truth motion to enable a visual comparison with the estimated motion."
 },
 {
-"metadata": {},
 "cell_type": "code",
-"outputs": [],
 "execution_count": null,
+"id": "1009ea77e1bdd0ee",
+"metadata": {},
+"outputs": [],
 "source": [
 "from nifreeze.viz.motion_viz import plot_volumewise_motion\n",
 "\n",
@@ -2505,14 +2506,13 @@
 "\n",
 "plt.tight_layout()\n",
 "plt.show()"
-],
-"id": "1009ea77e1bdd0ee"
+]
 },
 {
-"metadata": {},
 "cell_type": "markdown",
-"source": "Let's plot the estimated and the ground truth framewise displacement.",
-"id": "113b4b4d1361b5ec"
+"id": "113b4b4d1361b5ec",
+"metadata": {},
+"source": "Let's plot the estimated and the ground truth framewise displacement."
 },
 {
 "cell_type": "code",

docs/usage.rst

Lines changed: 5 additions & 4 deletions

@@ -34,10 +34,11 @@ To utilize *NiFreeze* functionalities within your Python module or script, follo
 Use the appropriate parameters for the particular imaging modality (e.g.
 dMRI, fMRI, or PET) that you are using.

-For example, for dMRI data, ensure the gradient table is provided. It
-should have one column per diffusion-weighted image. The first three rows
-represent the gradient directions, and the last row indicates the timing
-and strength of the gradients in units of s/mm² ``[ R A S+ b ]``.
+For example, for dMRI data, ensure the gradient table is provided. NiFreeze
+expects the table to have one row per diffusion-weighted image, with the
+first three columns storing the gradient direction components and the last
+column indicating the timing and strength of the gradients in units of
+s/mm² ``[ R A S+ b ]``.

 .. code-block:: python

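The corrected documentation describes an ``[ R A S+ b ]`` table with one row per volume. As a sketch with hypothetical values (the bvec/bval arrays below are illustrative, not from the project's test data), such a table can be assembled from FSL-style inputs, which store one direction column per volume:

```python
import numpy as np

# Hypothetical FSL-style inputs: bvecs holds one *column* per volume (3, N);
# bvals holds one b-value per volume (N,).
bvecs = np.array(
    [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
    ]
)
bvals = np.array([1000.0, 1000.0, 2000.0, 0.0])

# Build the (N, 4) table the documentation describes: one row per volume,
# first three columns the [R A S+] direction, last column the b-value.
gradients = np.column_stack([bvecs.T, bvals])

print(gradients.shape)  # → (4, 4)
```

Transposing `bvecs` before stacking is what converts the column-per-volume convention into the row-per-volume layout the updated docs specify.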
src/nifreeze/data/base.py

Lines changed: 1 addition & 1 deletion

@@ -101,7 +101,7 @@ def __len__(self) -> int:

 def _getextra(self, idx: int | slice | tuple | np.ndarray) -> tuple[Unpack[Ts]]:
     """
-    Extracts extra fields synchronized with the indexed access of the corresponding data object.
+    Extract extra fields for a given index of the corresponding data object.

     Parameters
     ----------