Merged

Changes from 16 commits

Commits (25 commits)
df57e61
test grain dataloader
tristan-deep Feb 9, 2026
f24f24f
add dataloader to init
tristan-deep Feb 9, 2026
a38ed38
device prefetching for all backends
swpenninga Feb 26, 2026
64276ea
np transforms replaced with keras transforms after gpu fetching
swpenninga Feb 26, 2026
adcdbd4
Merge branch 'main' into feature/grain-dataloader
tristan-deep Mar 6, 2026
a55df3e
merge fixes
tristan-deep Mar 6, 2026
9600259
Merge branch 'main' into feature/grain-dataloader
tristan-deep Mar 11, 2026
1381a77
Merge remote-tracking branch 'origin/main' into feature/grain-dataloader
tristan-deep Mar 12, 2026
adfb049
Merge branch 'main' into feature/grain-dataloader
tristan-deep Mar 13, 2026
795ce21
remove gpu transforms, copy functionality, replace make_dataloader, f…
swpenninga Mar 18, 2026
0160a09
Remove old dataloader and some related code
wesselvannierop Mar 19, 2026
9bcb82f
docstring
wesselvannierop Mar 19, 2026
da4e443
Remove logic related to H5Generator
wesselvannierop Mar 19, 2026
3bbdaed
Fix tests H5DataSource
wesselvannierop Mar 19, 2026
1ae5174
rabbit fixes
swpenninga Mar 20, 2026
7e301c8
added grain dataloader tests
swpenninga Mar 20, 2026
663c9c1
rabbit
wesselvannierop Mar 20, 2026
813c68a
Remove custom json for dataloader
wesselvannierop Mar 20, 2026
10169e6
Bug file locking
wesselvannierop Mar 20, 2026
180dd97
canonicalize axis
wesselvannierop Mar 20, 2026
0a0df20
Simplify logic and more bugproof
wesselvannierop Mar 23, 2026
c5e2f4e
Use public API in nbs
wesselvannierop Mar 23, 2026
520f6c7
More public API
wesselvannierop Mar 23, 2026
6b36b91
Improve docstring (mainly with defaults)
wesselvannierop Mar 23, 2026
a3055ef
output str instead of Path
wesselvannierop Mar 23, 2026
10 changes: 5 additions & 5 deletions docs/source/notebooks/data/zea_data_example.ipynb
@@ -9,7 +9,7 @@
"\n",
"1. Loading data from single file with `zea.File`\n",
"2. Loading data from a group of files with `zea.Dataset`\n",
"3. Loading data in batches with dataloading utilities with `zea.backend.tensorflow.make_dataloader`"
"3. Loading data in batches with dataloading utilities with `zea.data.dataloader.Dataloader`"
]
},
{
@@ -91,7 +91,7 @@
"import zea\n",
"from zea import init_device, load_file\n",
"from zea.visualize import set_mpl_style\n",
"from zea.backend.tensorflow import make_dataloader"
"from zea.data.dataloader import Dataloader"
]
},
{
@@ -378,9 +378,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Loading data with `make_dataloader`\n",
"## Loading data with `Dataloader`\n",
"\n",
"In machine and deep learning workflows, we often want more features like batching, shuffling, and parallel data loading. The `zea.backend.tensorflow.make_dataloader` function provides a convenient way to create a TensorFlow data loader from a zea dataset. This does require a working TensorFlow installation, but does work in combination with any other backend as well. This dataloader is particularly useful for training models. It is important that there is some consistency in the dataset, which is not the case for [PICMUS](https://www.creatis.insa-lyon.fr/Challenge/IEEE_IUS_2016/home). Therefore in this example we will use a small part of the [CAMUS](https://www.creatis.insa-lyon.fr/Challenge/camus/) dataset."
"In machine and deep learning workflows, we often want additional features such as batching, shuffling, and parallel data loading. The `zea.data.dataloader.Dataloader` class provides a convenient way to create a high-performance data loader from a zea dataset. It is built on Grain and does not require TensorFlow, which makes it particularly useful for training models. The dataloader works best when all samples share a consistent shape, which is not the case for [PICMUS](https://www.creatis.insa-lyon.fr/Challenge/IEEE_IUS_2016/home). Therefore, in this example we use a small part of the [CAMUS](https://www.creatis.insa-lyon.fr/Challenge/camus/) dataset."
]
},
{
@@ -460,7 +460,7 @@
],
"source": [
"dataset_path = \"hf://zeahub/camus-sample/val\"\n",
"dataloader = make_dataloader(\n",
"dataloader = Dataloader(\n",
" dataset_path,\n",
" key=\"data/image_sc\",\n",
" batch_size=4,\n",
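The markdown cell changed above describes what a batched, shuffled dataloader provides. As a purely illustrative aside (this is not zea's or Grain's implementation; `MiniDataloader` and all its behavior are hypothetical), the core iteration contract such a loader satisfies can be sketched in plain Python:

```python
import random


class MiniDataloader:
    """Toy stand-in illustrating what a batched, shuffled dataloader does.

    NOT zea's Dataloader: it only mimics the iteration behavior (shuffle
    once per epoch, yield fixed-size batches, drop the trailing remainder).
    """

    def __init__(self, samples, batch_size, shuffle=True, seed=0):
        self.samples = list(samples)
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.rng = random.Random(seed)  # seeded for reproducible epochs

    def __iter__(self):
        order = self.samples[:]
        if self.shuffle:
            self.rng.shuffle(order)
        # Yield only full batches; the trailing partial batch is dropped.
        for i in range(0, len(order) - self.batch_size + 1, self.batch_size):
            yield order[i : i + self.batch_size]


loader = MiniDataloader(range(10), batch_size=4)
print([len(batch) for batch in loader])  # → [4, 4]
```

The real `Dataloader` shown in the diff layers parallel reads and prefetching on top of this same iterate-in-batches contract.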
2 changes: 1 addition & 1 deletion docs/source/notebooks/metrics.rst
@@ -1,5 +1,5 @@
Metrics
=======
========

.. toctree::
:maxdepth: 1
4 changes: 2 additions & 2 deletions docs/source/notebooks/metrics/lpips_example.ipynb

@@ -78,7 +78,7 @@
"from keras import ops\n",
"\n",
"from zea import init_device\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"from zea.models.lpips import LPIPS\n",
"from zea.visualize import plot_image_grid, set_mpl_style"
]
@@ -165,7 +165,7 @@
],
"source": [
"n_imgs = 9\n",
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image\",\n",
" batch_size=n_imgs,\n",
@@ -62,7 +62,7 @@
"import numpy as np\n",
"\n",
"from zea import init_device\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"from zea.visualize import set_mpl_style\n",
"from zea.io_lib import matplotlib_figure_to_numpy, save_video\n",
"\n",
@@ -142,7 +142,7 @@
"# Load a batch and run both models.\n",
"n_imgs = 1\n",
"INFERENCE_SIZE = 256\n",
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image_sc\",\n",
" batch_size=n_imgs,\n",
4 changes: 2 additions & 2 deletions docs/source/notebooks/models/hvae_model_example.ipynb

@@ -79,7 +79,7 @@
"from zea.display import scan_convert_2d\n",
"from zea.agent.selection import UniformRandomLines\n",
"from zea.visualize import set_mpl_style, plot_image_grid\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"\n",
"init_device(verbose=False)\n",
"set_mpl_style()"
@@ -135,7 +135,7 @@
}
],
"source": [
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image\",\n",
" batch_size=batch_size,\n",
@@ -70,7 +70,7 @@
"from zea import init_device\n",
"import matplotlib.pyplot as plt\n",
"from keras import ops\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"from zea.visualize import plot_shape_from_mask\n",
"from zea.func import translate\n",
"from zea.visualize import plot_image_grid, set_mpl_style\n",
@@ -116,7 +116,7 @@
"source": [
"n_imgs = 16\n",
"INFERENCE_SIZE = 256 # Used for both models\n",
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image_sc\",\n",
" batch_size=n_imgs,\n",
4 changes: 2 additions & 2 deletions docs/source/notebooks/models/taesd_autoencoder_example.ipynb

@@ -81,7 +81,7 @@
"\n",
"import zea\n",
"from zea import init_device\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"from zea.models.taesd import TinyAutoencoder\n",
"from zea.visualize import plot_image_grid, set_mpl_style"
]
@@ -143,7 +143,7 @@
],
"source": [
"n_imgs = 4\n",
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image\",\n",
" batch_size=n_imgs,\n",
4 changes: 2 additions & 2 deletions docs/source/notebooks/models/unet_example.ipynb

@@ -77,7 +77,7 @@
"from keras import ops\n",
"\n",
"from zea import init_device, log\n",
"from zea.backend.tensorflow.dataloader import make_dataloader\n",
"from zea.data.dataloader import Dataloader\n",
"from zea.models.unet import UNet\n",
"from zea.models.lpips import LPIPS\n",
"from zea.agent.masks import random_uniform_lines\n",
@@ -142,7 +142,7 @@
"source": [
"n_imgs = 8\n",
"\n",
"val_dataset = make_dataloader(\n",
"val_dataset = Dataloader(\n",
" \"hf://zeahub/camus-sample/val\",\n",
" key=\"data/image\",\n",
" batch_size=n_imgs,\n",