
Commit 18419f0

use local imports in store ABC to avoid circular import issues (#3372)
* use local imports in store ABC to avoid circular import issues
* changelog

1 parent a26926c

File tree: 2 files changed (+10, -4 lines)


changes/3372.misc.rst

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+Make certain imports in ``zarr.abc.store`` local to method definitions. This minimizes the risk of
+circular imports when adding new classes to ``zarr.abc.store``.
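
As a rough illustration of the pattern this entry describes (the module and function names below are hypothetical, not the actual zarr layout): if pkg.core imports pkg.store at its top level, then pkg.store cannot also import pkg.core at its top level without creating a cycle, but an import placed inside a method body only runs when the method is called, by which point both modules have finished initializing.

    # pkg/store.py (hypothetical sketch of the local-import pattern)
    class Store:
        def getsize(self, key: str) -> int:
            # avoid circular import: pkg.core imports pkg.store at module
            # level, so this import is deferred until call time
            from pkg.core import get_object_size

            return get_object_size(self, key)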

src/zarr/abc/store.py

Lines changed: 8 additions & 4 deletions
@@ -6,10 +6,6 @@
 from itertools import starmap
 from typing import TYPE_CHECKING, Protocol, runtime_checkable
 
-from zarr.core.buffer.core import default_buffer_prototype
-from zarr.core.common import concurrent_map
-from zarr.core.config import config
-
 if TYPE_CHECKING:
     from collections.abc import AsyncGenerator, AsyncIterator, Iterable
     from types import TracebackType
@@ -438,6 +434,9 @@ async def getsize(self, key: str) -> int:
         # Note to implementers: this default implementation is very inefficient since
         # it requires reading the entire object. Many systems will have ways to get the
         # size of an object without reading it.
+        # avoid circular import
+        from zarr.core.buffer.core import default_buffer_prototype
+
         value = await self.get(key, prototype=default_buffer_prototype())
         if value is None:
             raise FileNotFoundError(key)
@@ -476,6 +475,11 @@ async def getsize_prefix(self, prefix: str) -> int:
         # on to getting sizes. Ideally we would overlap those two, which should
         # improve tail latency and might reduce memory pressure (since not all keys
         # would be in memory at once).
+
+        # avoid circular import
+        from zarr.core.common import concurrent_map
+        from zarr.core.config import config
+
         keys = [(x,) async for x in self.list_prefix(prefix)]
         limit = config.get("async.concurrency")
         sizes = await concurrent_map(keys, self.getsize, limit=limit)
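
A note on the cost of this pattern (an aside, not part of the commit): a function-level import only does real work on the first call; afterwards Python finds the module cached in sys.modules, so each subsequent call pays roughly a dictionary lookup. A rough way to see this, assuming zarr is installed:

    import sys
    import timeit

    def local_import() -> None:
        # after the first call this resolves from the sys.modules cache
        from zarr.core.config import config  # noqa: F401

    local_import()  # the first call actually imports the module
    print("zarr.core.config cached:", "zarr.core.config" in sys.modules)
    print("100k further calls:", timeit.timeit(local_import, number=100_000), "seconds")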
