10 changes: 5 additions & 5 deletions .pre-commit-config.yaml
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v4.6.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
@@ -9,17 +9,17 @@ repos:
- id: check-yaml
- id: debug-statements
- id: mixed-line-ending
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: 'v0.0.291'
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: 'v0.5.0'
hooks:
- id: ruff
args: [ "--fix" ]
- repo: https://github.com/psf/black
rev: 23.9.1
rev: 24.4.2
hooks:
- id: black
- repo: https://github.com/adamchainz/blacken-docs
rev: "1.16.0"
rev: "1.18.0"
hooks:
- id: blacken-docs
additional_dependencies:
46 changes: 46 additions & 0 deletions CHANGELOG.rst
@@ -1,6 +1,52 @@
Change Log
----------

Version 1.6.1 (March 7th, 2025):

- Let Variable.chunks return None for scalar variables, independent of what the underlying
h5ds object returns ({pull}`259`).
By `Rickard Holmberg <https://github.com/rho-novatron>`_
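  The fix above can be sketched with a small stand-in (the class below is illustrative only, not the actual h5netcdf internals): a scalar, zero-dimensional variable reports ``chunks`` as ``None`` regardless of what the wrapped h5py/h5pyd dataset object claims.

  .. code-block:: python

      class Variable:
          """Minimal sketch of the chunks behavior for scalar variables."""

          def __init__(self, h5ds):
              self._h5ds = h5ds  # wrapped h5py-style dataset (hypothetical)

          @property
          def chunks(self):
              # Scalar (zero-dimensional) variables cannot be chunked in the
              # netCDF data model, so report None regardless of the backend.
              if self._h5ds.shape == ():
                  return None
              return self._h5ds.chunks


      class FakeDataset:
          """Pretend backend dataset whose chunks value may be unreliable."""

          def __init__(self, shape, chunks):
              self.shape = shape
              self.chunks = chunks


      print(Variable(FakeDataset((), chunks=(1,))).chunks)     # scalar -> None
      print(Variable(FakeDataset((10,), chunks=(5,))).chunks)  # chunked -> (5,)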

Version 1.6.0 (March 7th, 2025):

- Allow specifying `h5netcdf.File(driver="h5pyd")` to force the use of h5pyd ({issue}`255`, {pull}`256`).
By `Rickard Holmberg <https://github.com/rho-novatron>`_
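  A hedged sketch of what "force the use of h5pyd" means in practice (the helper name and dispatch rules below are assumptions for illustration, not the actual h5netcdf code): the h5pyd backend is selected either explicitly via ``driver="h5pyd"`` or implicitly for http(s) URLs.

  .. code-block:: python

      def _resolve_backend(path, driver=None):
          # Hypothetical backend selection: honor an explicit h5pyd request
          # first, treat any other driver as a plain h5py driver, and fall
          # back to URL sniffing for remote (HSDS) paths.
          if driver == "h5pyd":
              return "h5pyd"
          if driver is not None:
              return "h5py"  # e.g. driver="ros3" or driver="core"
          if isinstance(path, str) and path.startswith(("http://", "https://")):
              return "h5pyd"
          return "h5py"


      print(_resolve_backend("mydata.nc"))                   # h5py
      print(_resolve_backend("/home/user/f.nc", "h5pyd"))    # h5pyd
      print(_resolve_backend("https://hsds.example/f.nc"))   # h5pyd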
- Add pytest-mypy-plugins for xarray nightly test ({pull}`257`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_

Version 1.5.0 (January 26th, 2025):

- Update CI to new versions (Python 3.13, 3.14 alpha), remove numpy 1 from h5pyd runs ({pull}`250`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- Update CI and reinstate h5pyd/hsds test runs ({pull}`247`).
By `John Readey <https://github.com/jreadey>`_
- Allow ``zlib`` to be used as an alias for ``gzip`` for enhanced compatibility with h5netcdf's API and xarray.
By `Mark Harfouche <https://github.com/hmaarrfk>`_
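  The alias can be pictured as a one-line normalization step (the helper below is a sketch under that assumption, not the actual h5netcdf code path): the netCDF4-python/xarray name ``zlib`` maps onto HDF5's ``gzip`` filter, and everything else passes through unchanged.

  .. code-block:: python

      def _normalize_compression(compression):
          # Hypothetical helper: accept "zlib" as an alias for HDF5's "gzip"
          # filter, matching netCDF4-python and xarray terminology.
          if compression == "zlib":
              return "gzip"
          return compression


      print(_normalize_compression("zlib"))  # gzip
      print(_normalize_compression("lzf"))   # lzf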

Version 1.4.1 (November 13th, 2024):

- Add CI run for hdf5 1.10.6, fix complex tests, fix enum/user type tests ({pull}`244`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_


Version 1.4.0 (October 7th, 2024):

- Add UserType class, add EnumType ({pull}`229`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- Refactor fillvalue and dtype handling for user types, enhance sanity checks and tests ({pull}`230`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- Add VLType and CompoundType, commit complex compound type to file. Align with nc-complex ({pull}`227`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- Update h5pyd testing.
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- CI and lint maintenance ({pull}`235`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
- Support wrapping an h5py ``File`` object. Closing the h5netcdf file object
does not close the h5py file ({pull}`238`).
By `Thomas Kluyver <https://github.com/takluyver>`_
- CI and lint maintenance (format README.rst, use more f-strings, change Python 3.9 to 3.10 in CI) ({pull}`239`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_

Version 1.3.0 (November 7th, 2023):

- Add ros3 support by checking `driver`-kwarg.
51 changes: 25 additions & 26 deletions PKG-INFO
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Metadata-Version: 2.2
Name: h5netcdf
Version: 1.3.0
Version: 1.6.1
Summary: netCDF4 via h5py
Author-email: Stephan Hoyer <[email protected]>, Kai Mühlbauer <[email protected]>
Maintainer-email: h5netcdf developers <[email protected]>
@@ -45,6 +45,7 @@ Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering
Requires-Python: >=3.9
Description-Content-Type: text/x-rst
@@ -138,32 +139,32 @@ design is an adaptation of h5py to the netCDF data model. For example:
import h5netcdf
import numpy as np

with h5netcdf.File('mydata.nc', 'w') as f:
with h5netcdf.File("mydata.nc", "w") as f:
# set dimensions with a dictionary
f.dimensions = {'x': 5}
f.dimensions = {"x": 5}
# and update them with a dict-like interface
# f.dimensions['x'] = 5
# f.dimensions.update({'x': 5})

v = f.create_variable('hello', ('x',), float)
v = f.create_variable("hello", ("x",), float)
v[:] = np.ones(5)

# you don't need to create groups first
# you also don't need to create dimensions first if you supply data
# with the new variable
v = f.create_variable('/grouped/data', ('y',), data=np.arange(10))
v = f.create_variable("/grouped/data", ("y",), data=np.arange(10))

# access and modify attributes with a dict-like interface
v.attrs['foo'] = 'bar'
v.attrs["foo"] = "bar"

# you can access variables and groups directly using
# hierarchical keys, like h5py
print(f['/grouped/data'])
print(f["/grouped/data"])

# add an unlimited dimension
f.dimensions['z'] = None
f.dimensions["z"] = None
# explicitly resize a dimension and all variables using it
f.resize_dimension('z', 3)
f.resize_dimension("z", 3)

Notes:

@@ -184,22 +185,23 @@ The legacy API is designed for compatibility with `netCDF4-python`_. To use it,
.. code-block:: python

import h5netcdf.legacyapi as netCDF4

# everything here would also work with this instead:
# import netCDF4
import numpy as np

with netCDF4.Dataset('mydata.nc', 'w') as ds:
ds.createDimension('x', 5)
v = ds.createVariable('hello', float, ('x',))
with netCDF4.Dataset("mydata.nc", "w") as ds:
ds.createDimension("x", 5)
v = ds.createVariable("hello", float, ("x",))
v[:] = np.ones(5)

g = ds.createGroup('grouped')
g.createDimension('y', 10)
g.createVariable('data', 'i8', ('y',))
v = g['data']
g = ds.createGroup("grouped")
g.createDimension("y", 10)
g.createVariable("data", "i8", ("y",))
v = g["data"]
v[:] = np.arange(10)
v.foo = 'bar'
print(ds.groups['grouped'].variables['data'])
v.foo = "bar"
print(ds.groups["grouped"].variables["data"])

The legacy API is designed to be easy to try out for netCDF4-python users, but it is not an
exact match. Here is an incomplete list of functionality we don't include:
@@ -222,9 +224,6 @@ h5py implements some features that do not (yet) result in valid netCDF files:

- Data types:
- Booleans
- Complex values
- Non-string variable length types
- Enum types
- Reference types
- Arbitrary filters:
- Scale-offset filters
@@ -239,11 +238,11 @@ when creating a file:
.. code-block:: python

# avoid the .nc extension for non-netcdf files
f = h5netcdf.File('mydata.h5', invalid_netcdf=True)
f = h5netcdf.File("mydata.h5", invalid_netcdf=True)
...

# works with the legacy API, too, though compression options are not exposed
ds = h5netcdf.legacyapi.Dataset('mydata.h5', invalid_netcdf=True)
ds = h5netcdf.legacyapi.Dataset("mydata.h5", invalid_netcdf=True)
...

In such cases the `_NCProperties` attribute will not be saved to the file or be removed
@@ -281,7 +280,7 @@ phony dimensions according to `netCDF`_ behaviour.
.. code-block:: python

# mimic netCDF-behaviour for non-netcdf files
f = h5netcdf.File('mydata.h5', mode='r', phony_dims='sort')
f = h5netcdf.File("mydata.h5", mode="r", phony_dims="sort")
...

Note that this iterates once over the whole group hierarchy, which adds
@@ -292,7 +291,7 @@ to group access time. The created phony dimension naming will differ from

.. code-block:: python

f = h5netcdf.File('mydata.h5', mode='r', phony_dims='access')
f = h5netcdf.File("mydata.h5", mode="r", phony_dims="access")
...

.. rubric:: Footnotes
46 changes: 22 additions & 24 deletions README.rst
@@ -80,32 +80,32 @@ design is an adaptation of h5py to the netCDF data model. For example:
import h5netcdf
import numpy as np

with h5netcdf.File('mydata.nc', 'w') as f:
with h5netcdf.File("mydata.nc", "w") as f:
# set dimensions with a dictionary
f.dimensions = {'x': 5}
f.dimensions = {"x": 5}
# and update them with a dict-like interface
# f.dimensions['x'] = 5
# f.dimensions.update({'x': 5})

v = f.create_variable('hello', ('x',), float)
v = f.create_variable("hello", ("x",), float)
v[:] = np.ones(5)

# you don't need to create groups first
# you also don't need to create dimensions first if you supply data
# with the new variable
v = f.create_variable('/grouped/data', ('y',), data=np.arange(10))
v = f.create_variable("/grouped/data", ("y",), data=np.arange(10))

# access and modify attributes with a dict-like interface
v.attrs['foo'] = 'bar'
v.attrs["foo"] = "bar"

# you can access variables and groups directly using
# hierarchical keys, like h5py
print(f['/grouped/data'])
print(f["/grouped/data"])

# add an unlimited dimension
f.dimensions['z'] = None
f.dimensions["z"] = None
# explicitly resize a dimension and all variables using it
f.resize_dimension('z', 3)
f.resize_dimension("z", 3)

Notes:

@@ -126,22 +126,23 @@ The legacy API is designed for compatibility with `netCDF4-python`_. To use it,
.. code-block:: python

import h5netcdf.legacyapi as netCDF4

# everything here would also work with this instead:
# import netCDF4
import numpy as np

with netCDF4.Dataset('mydata.nc', 'w') as ds:
ds.createDimension('x', 5)
v = ds.createVariable('hello', float, ('x',))
with netCDF4.Dataset("mydata.nc", "w") as ds:
ds.createDimension("x", 5)
v = ds.createVariable("hello", float, ("x",))
v[:] = np.ones(5)

g = ds.createGroup('grouped')
g.createDimension('y', 10)
g.createVariable('data', 'i8', ('y',))
v = g['data']
g = ds.createGroup("grouped")
g.createDimension("y", 10)
g.createVariable("data", "i8", ("y",))
v = g["data"]
v[:] = np.arange(10)
v.foo = 'bar'
print(ds.groups['grouped'].variables['data'])
v.foo = "bar"
print(ds.groups["grouped"].variables["data"])

The legacy API is designed to be easy to try out for netCDF4-python users, but it is not an
exact match. Here is an incomplete list of functionality we don't include:
@@ -164,9 +165,6 @@ h5py implements some features that do not (yet) result in valid netCDF files:

- Data types:
- Booleans
- Complex values
- Non-string variable length types
- Enum types
- Reference types
- Arbitrary filters:
- Scale-offset filters
@@ -181,11 +179,11 @@ when creating a file:
.. code-block:: python

# avoid the .nc extension for non-netcdf files
f = h5netcdf.File('mydata.h5', invalid_netcdf=True)
f = h5netcdf.File("mydata.h5", invalid_netcdf=True)
...

# works with the legacy API, too, though compression options are not exposed
ds = h5netcdf.legacyapi.Dataset('mydata.h5', invalid_netcdf=True)
ds = h5netcdf.legacyapi.Dataset("mydata.h5", invalid_netcdf=True)
...

In such cases the `_NCProperties` attribute will not be saved to the file or be removed
@@ -223,7 +221,7 @@ phony dimensions according to `netCDF`_ behaviour.
.. code-block:: python

# mimic netCDF-behaviour for non-netcdf files
f = h5netcdf.File('mydata.h5', mode='r', phony_dims='sort')
f = h5netcdf.File("mydata.h5", mode="r", phony_dims="sort")
...

Note that this iterates once over the whole group hierarchy, which adds
@@ -234,7 +232,7 @@ to group access time. The created phony dimension naming will differ from

.. code-block:: python

f = h5netcdf.File('mydata.h5', mode='r', phony_dims='access')
f = h5netcdf.File("mydata.h5", mode="r", phony_dims="access")
...

.. rubric:: Footnotes
59 changes: 59 additions & 0 deletions debian/changelog
@@ -1,3 +1,62 @@
python-h5netcdf (1.6.1-1) unstable; urgency=medium

* Team upload.
* New upstream release
* Standards-Version: 4.7.2

-- Drew Parsons <[email protected]> Fri, 04 Apr 2025 12:06:33 +0200

python-h5netcdf (1.5.0-1) unstable; urgency=medium

* Team upload.
* New upstream release

-- Drew Parsons <[email protected]> Thu, 20 Feb 2025 20:12:37 +0100

python-h5netcdf (1.4.1-1) unstable; urgency=medium

* Team upload.
* New upstream release
- applies debian patch fix_tests_PR244.patch

-- Drew Parsons <[email protected]> Sun, 24 Nov 2024 23:17:49 +0100

python-h5netcdf (1.4.0-3) unstable; urgency=medium

* Team upload.
* replace debian patches 32bit_skip_complex_type_creation.patch and
test_h5py_TypeError.patch with fix_tests_PR244.patch, applying
upstream PR#244 to fix tests. Closes: #1087199.

-- Drew Parsons <[email protected]> Wed, 13 Nov 2024 00:08:15 +0100

python-h5netcdf (1.4.0-2) unstable; urgency=medium

* Team upload.
* debian patch 32bit_skip_complex_type_creation.patch works around
32-bit complex type test failure. See Bug#1087199.

-- Drew Parsons <[email protected]> Sat, 09 Nov 2024 15:30:00 +0100

python-h5netcdf (1.4.0-1) unstable; urgency=medium

* Team upload.
* New upstream release.
- Build-Depends: python3-packaging
* use pybuild build_dir to build docs
* ignore pycache dir in doc build if it wasn't created
* add debian patches
- doc_no_sphinx_book_theme.patch disables use of sphinx-book-theme
in docs. sphinx-book-theme is essentially unusable, requiring a
specific nodejs version which is not available.
- test_h5py_TypeError.patch catches h5py exception as TypeError
not KeyError. See upstream Issue#236.
* remove generated test files after buildtime testing
* Standards-Version: 4.7.0
* debian/tests: run tests with and without internet access (ros3)

-- Drew Parsons <[email protected]> Fri, 08 Nov 2024 14:18:45 +0100

python-h5netcdf (1.3.0-1) unstable; urgency=medium

* Team upload.