ds.encoding['unlimited_dims'] not getting updated properly #10647

@xylar

Description

What happened?

With the latest release (2025.8.0), I'm having issues with unlimited dimensions. When I select a specific index along the unlimited dimension, the dimension itself is dropped from the dataset, but it remains in ds.encoding['unlimited_dims']. I believe this is not new behavior, but with the latest release it now causes an error.

The reproducer below produces this error:

$ ./reproducer.py 
Traceback (most recent call last):
  File "/home/xylar/Desktop/reproducer/./reproducer.py", line 25, in <module>
    ds0.to_netcdf('dataset_time0.nc')
    ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
  File "/home/xylar/miniforge3/envs/test/lib/python3.13/site-packages/xarray/core/dataset.py", line 2109, in to_netcdf
    return to_netcdf(  # type: ignore[return-value]  # mypy cannot resolve the overloads:(
        self,
    ...<10 lines>...
        auto_complex=auto_complex,
    )
  File "/home/xylar/miniforge3/envs/test/lib/python3.13/site-packages/xarray/backends/api.py", line 2041, in to_netcdf
    unlimited_dims = _sanitize_unlimited_dims(dataset, unlimited_dims)
  File "/home/xylar/miniforge3/envs/test/lib/python3.13/site-packages/xarray/backends/api.py", line 265, in _sanitize_unlimited_dims
    raise ValueError(msg)
ValueError: Unlimited dimension(s) {'time'} declared in 'dataset.encoding', but not part of current dataset dimensions. Consider removing {'time'} from 'dataset.encoding'.

What did you expect to happen?

I would expect either that an unlimited dimension (time in my example) is removed from ds.encoding['unlimited_dims'] when it is dropped from the dataset itself, or that .to_netcdf() tolerates entries in unlimited_dims that are not among the dataset's dimensions (as it apparently did before).
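A minimal sketch of the state that triggers the error, plus a by-hand workaround (the list-comprehension filter is my own suggestion, not an xarray API):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"var1": (("time", "x"), np.random.rand(10, 5))},
    coords={"time": np.arange(10), "x": np.arange(5)},
)
ds.encoding["unlimited_dims"] = ["time"]

# isel(time=0) drops 'time' as a dimension, but the dataset-level
# encoding is carried over unchanged and still lists 'time'.
ds0 = ds.isel(time=0)
print("time" in ds0.dims)             # False
print(ds0.encoding["unlimited_dims"])  # ['time']  <- stale entry

# Workaround: keep only dimensions that still exist before writing.
ds0.encoding["unlimited_dims"] = [
    d for d in ds0.encoding.get("unlimited_dims", []) if d in ds0.dims
]
```

After the filter, ds0.to_netcdf() no longer trips over the stale entry.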

Minimal Complete Verifiable Example

#!/usr/bin/env python

import numpy as np
import xarray as xr

# Create coordinates
x = np.arange(5)
time = np.arange(10)

# Create dataset with two dummy variables
ds = xr.Dataset(
    data_vars={
        'var1': (('time', 'x'), np.random.rand(time.size, x.size)),
        'var2': (('time', 'x'), np.random.rand(time.size, x.size)),
    },
    coords={'time': time, 'x': x},
)

# Set time as unlimited and write full dataset
ds.encoding['unlimited_dims'] = ['time']
ds.to_netcdf('dataset.nc')

# Select only the first time entry (drop dimension) and write again
ds0 = ds.isel(time=0)
ds0.to_netcdf('dataset_time0.nc')
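As a stopgap until this is resolved, the reproducer can also be made to run by clearing the inherited encoding with Dataset.drop_encoding() before the second write; this is a workaround sketch, not the fix proposed above:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"var1": (("time", "x"), np.random.rand(10, 5))},
    coords={"time": np.arange(10), "x": np.arange(5)},
)
ds.encoding["unlimited_dims"] = ["time"]

# drop_encoding() returns a copy with all dataset- and variable-level
# encoding removed, so the stale 'time' entry is gone too.
ds0 = ds.isel(time=0).drop_encoding()
assert "unlimited_dims" not in ds0.encoding

# ds0.to_netcdf('dataset_time0.nc')  # would no longer raise ValueError
```

Note that drop_encoding() also discards any variable-level encoding (dtype, compression, etc.), so the per-dimension filter above is the more surgical option.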

MVCE confirmation

  • Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • Complete example — the example is self-contained, including all data and the text of any traceback.
  • Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • New issue — a search of GitHub Issues suggests this is not a duplicate.
  • Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

Same traceback as shown under "What happened?" above.

Anything else we need to know?

No response

Environment

INSTALLED VERSIONS
------------------
commit: None
python: 3.13.5 | packaged by conda-forge | (main, Jun 16 2025, 08:27:50) [GCC 13.3.0]
python-bits: 64
OS: Linux
OS-release: 6.11.0-121029-tuxedo
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.6
libnetcdf: 4.9.2

xarray: 2025.8.0
pandas: 2.3.1
numpy: 2.3.2
scipy: 1.16.1
netCDF4: 1.7.2
pydap: None
h5netcdf: None
h5py: None
zarr: None
cftime: 1.6.4
nc_time_axis: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: None
pip: 25.2
conda: None
pytest: None
mypy: None
IPython: None
sphinx: None

Metadata

    Labels

    bug, topic-metadata (Relating to the handling of metadata, i.e. attrs and encoding)
