
An error with the ome2024-ngff-challenge tool #59

@kkyoda

Description


I am trying to update metadata such as license, organism, and modality in OME-Zarr filesets converted with bioformats2raw, using the ome2024-ngff-challenge tool. However, an error occurs when I apply it to the following dataset:

https://ome.github.io/ome-ngff-validator/?source=https://ssbd.riken.jp/100118-dcacbb41/zarr/v0.4/Fig3a_FIB-SEM_synapse.zarr

(ngff-challenge) kyoda@kyoudakacstudio ngff-challenge % ome2024-ngff-challenge resave --cc-by Fig3a_FIB-SEM_synapse.zarr test5.zarr
Traceback (most recent call last):
  File "/Users/kyoda/ngff-challenge/bin/ome2024-ngff-challenge", line 8, in <module>
    sys.exit(dispatch())
             ^^^^^^^^^^
  File "/Users/kyoda/ngff-challenge/lib/python3.11/site-packages/ome2024_ngff_challenge/__init__.py", line 36, in dispatch
    return ns.func(ns)
           ^^^^^^^^^^^
  File "/Users/kyoda/ngff-challenge/lib/python3.11/site-packages/ome2024_ngff_challenge/resave.py", line 433, in main
    convert_image(
  File "/Users/kyoda/ngff-challenge/lib/python3.11/site-packages/ome2024_ngff_challenge/resave.py", line 272, in convert_image
    convert_array(
  File "/Users/kyoda/ngff-challenge/lib/python3.11/site-packages/ome2024_ngff_challenge/resave.py", line 108, in convert_array
    write = ts.open(write_config).result()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: Error opening "zarr3" driver: Error writing local file "test5.zarr/0/zarr.json": Cannot create using specified "metadata" and schema: Incompatible chunk size constraints for dimension 3: read size of 1024, write size of 1538 [tensorstore_spec='{\"context\":{\"cache_pool\":{},\"data_copy_concurrency\":{},\"file_io_concurrency\":{},\"file_io_sync\":true},\"create\":true,\"driver\":\"zarr3\",\"dtype\":\"uint8\",\"kvstore\":{\"driver\":\"file\",\"path\":\"test5.zarr/0/\"},\"metadata\":{\"chunk_grid\":{\"configuration\":{\"chunk_shape\":[1,1,1,1538,2048]},\"name\":\"regular\"},\"chunk_key_encoding\":{\"name\":\"default\"},\"codecs\":[{\"configuration\":{\"chunk_shape\":[1,1,1,1024,1024],\"codecs\":[{\"configuration\":{\"endian\":\"little\"},\"name\":\"bytes\"},{\"configuration\":{\"clevel\":5,\"cname\":\"zstd\"},\"name\":\"blosc\"}],\"index_codecs\":[{\"configuration\":{\"endian\":\"little\"},\"name\":\"bytes\"},{\"name\":\"crc32c\"}],\"index_location\":\"end\"},\"name\":\"sharding_indexed\"}],\"data_type\":\"uint8\",\"dimension_names\":[\"t\",\"c\",\"z\",\"y\",\"x\"],\"node_type\":\"array\",\"shape\":[1,1,1,1538,2048]},\"transform\":{\"input_exclusive_max\":[[1],[1],[1],[1538],[2048]],\"input_inclusive_min\":[0,0,0,0,0],\"input_labels\":[\"t\",\"c\",\"z\",\"y\",\"x\"]}}'] [source locations='tensorstore/driver/zarr3/metadata.cc:901\ntensorstore/driver/zarr3/driver.cc:571\ntensorstore/driver/zarr3/driver.cc:571\ntensorstore/internal/cache/kvs_backed_cache.h:208\ntensorstore/driver/driver.cc:112']
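As far as I can tell, the "Incompatible chunk size constraints for dimension 3" message comes from the zarr3 sharding codec, which (per the Zarr v3 spec) requires each shard dimension to be an exact multiple of the inner chunk dimension. A minimal sketch of that check, using the shapes taken from the traceback above:

```python
# Shapes copied from the tensorstore spec in the traceback.
shard_shape = (1, 1, 1, 1538, 2048)   # chunk_grid "chunk_shape" (the shard)
inner_chunk = (1, 1, 1, 1024, 1024)   # sharding_indexed codec "chunk_shape"

# The sharding_indexed codec requires shard_shape[i] % inner_chunk[i] == 0
# for every dimension i; report any dimension that violates this.
for dim, (s, c) in enumerate(zip(shard_shape, inner_chunk)):
    if s % c != 0:
        print(f"dimension {dim}: shard size {s} is not a multiple of inner chunk size {c}")
```

Dimension 3 (y, size 1538) is not a multiple of 1024, which matches the "read size of 1024, write size of 1538" in the error.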

I apologize if this is a basic question, but I would appreciate it if anyone could suggest a solution. For reference, a Zarr fileset with updated metadata was generated successfully, without any issues, for the following dataset:

https://ome.github.io/ome-ngff-validator/?source=https://ssbd.riken.jp/100118-dcacbb41/zarr/v0.4/Fig5AC_Mitochondrial_MitoPB.zarr
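In case it helps narrow things down: assuming the resave command accepts explicit --output-chunks/--output-shards flags (as the tool's README appears to suggest; please correct me if the flag names differ), one workaround I could try is to force a shard shape that is a multiple of the chunk shape, e.g. rounding the y shard up from 1538 to 2048:

```shell
# Hypothetical workaround, not verified: choose shards that are exact
# multiples of the chunks (2048 % 1024 == 0), so the sharding constraint holds.
ome2024-ngff-challenge resave --cc-by \
    --output-chunks=1,1,1,1024,1024 \
    --output-shards=1,1,1,2048,2048 \
    Fig3a_FIB-SEM_synapse.zarr test5.zarr
```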
