@@ -1,144 +1,12 @@
-# webKnossos cuber (wkcuber)
-[](https://pypi.python.org/pypi/wkcuber)
-[](https://pypi.python.org/pypi/wkcuber)
-[](https://github.com/scalableminds/webknossos-cuber/actions?query=workflow%3A%22CI%22)
-[](https://github.com/psf/black)
+# webKnossos-libs
+<img align="right" src="https://static.webknossos.org/images/oxalis.svg" alt="webKnossos Logo" />

-Python library for creating and working with [webKnossos](https://webknossos.org) [WKW](https://github.com/scalableminds/webknossos-wrap) datasets. WKW is a container format for efficiently storing large-scale 3D image data as found in (electron) microscopy.
-
-The tools are modular components to allow easy integration into existing pipelines and workflows.
-
-## Features
-
-* `wkcuber`: Convert supported input files to fully ready WKW datasets (includes type detection, downsampling, compressing and metadata generation)
-* `wkcuber.convert_image_stack_to_wkw`: Convert image stacks to fully ready WKW datasets (includes downsampling, compressing and metadata generation)
-* `wkcuber.export_wkw_as_tiff`: Convert WKW datasets to a tiff stack (writing as tiles to a `z/y/x.tiff` folder structure is also supported)
-* `wkcuber.cubing`: Convert image stacks (e.g., `tiff`, `jpg`, `png`, `dm3`, `dm4`) to WKW cubes
-* `wkcuber.tile_cubing`: Convert tiled image stacks (e.g. in `z/y/x.ext` folder structure) to WKW cubes
-* `wkcuber.convert_knossos`: Convert KNOSSOS cubes to WKW cubes
-* `wkcuber.convert_nifti`: Convert NIFTI files to WKW files (currently without applying transformations)
-* `wkcuber.downsampling`: Create downsampled magnifications (with `median`, `mode` and linear interpolation modes). Downsampling compresses the new magnifications by default (disable via `--no-compress`).
-* `wkcuber.compress`: Compress WKW cubes for efficient file storage (especially useful for segmentation data)
-* `wkcuber.metadata`: Create (or refresh) metadata (with guessing of most parameters)
-* `wkcuber.recubing`: Read existing WKW cubes and write them out again with a specified WKW file length. Useful when a dataset was written with, e.g., file length 1.
-* `wkcuber.check_equality`: Compare two WKW datasets to check whether they are equal (e.g., after compressing a dataset, this task can be useful to double-check that the compressed dataset contains the same data).
-* Most modules support multiprocessing
-
-## Supported input formats
-
-* Standard image formats, e.g. `tiff`, `jpg`, `png`, `bmp`
-* Proprietary image formats, e.g. `dm3`
-* Tiled image stacks (as used for CATMAID)
-* KNOSSOS cubes
-* NIFTI files
-
-## Installation
-### Python 3 with pip from PyPI
-- `wkcuber` requires Python 3.6+
-
-```
-# Make sure to have lz4 installed:
-# Mac: brew install lz4
-# Ubuntu/Debian: apt-get install liblz4-1
-# CentOS/RHEL: yum install lz4
-
-pip install wkcuber
-```
-
-### Docker
-Use the CI-built image: [scalableminds/webknossos-cuber](https://hub.docker.com/r/scalableminds/webknossos-cuber/). Example usage: `docker run -v <host path>:/data --rm scalableminds/webknossos-cuber wkcuber --layer_name color --scale 11.24,11.24,25 --name great_dataset /data/source/color /data/target`.
-
-
-## Usage
-
-```
-# Convert arbitrary supported input files into wkw datasets. This sets reasonable defaults, but see the other commands for customization.
-python -m wkcuber \
-  --scale 11.24,11.24,25 \
-  data/source data/target
-
-# Convert image stacks into wkw datasets
-python -m wkcuber.convert_image_stack_to_wkw \
-  --layer_name color \
-  --scale 11.24,11.24,25 \
-  --name great_dataset \
-  data/source/color data/target
-
-# Convert image files to wkw cubes
-python -m wkcuber.cubing --layer_name color data/source/color data/target
-python -m wkcuber.cubing --layer_name segmentation data/source/segmentation data/target
-
-# Convert tiled image files to wkw cubes
-python -m wkcuber.tile_cubing --layer_name color data/source data/target
-
-# Convert KNOSSOS cubes to wkw cubes
-python -m wkcuber.convert_knossos --layer_name color data/source/mag1 data/target
+Collection of libraries around webKnossos; please see the respective subfolders:

-# Convert a NIFTI file to a wkw file
-python -m wkcuber.convert_nifti --layer_name color --scale 10,10,30 data/source/nifti_file data/target
+## [webKnossos cuber (wkcuber)](wkcuber)
+[](https://pypi.python.org/pypi/wkcuber) [](https://pypi.python.org/pypi/wkcuber)

-# Convert a folder with NIFTI files to wkw files
-python -m wkcuber.convert_nifti --color_file one_nifti_file --segmentation_file another_nifti --scale 10,10,30 data/source/ data/target
-
-# Create downsampled magnifications
-python -m wkcuber.downsampling --layer_name color data/target
-python -m wkcuber.downsampling --layer_name segmentation --interpolation_mode mode data/target
-
-# Compress data in-place (mostly useful for segmentation)
-python -m wkcuber.compress --layer_name segmentation data/target
-
-# Compress data into a copy (mostly useful for segmentation)
-python -m wkcuber.compress --layer_name segmentation data/target data/target_compress
-
-# Create metadata
-python -m wkcuber.metadata --name great_dataset --scale 11.24,11.24,25 data/target
-
-# Refresh metadata so that new layers and/or magnifications are picked up
-python -m wkcuber.metadata --refresh data/target
-
-# Recube an existing dataset
-python -m wkcuber.recubing --layer_name color --dtype uint8 /data/source/wkw /data/target
-
-# Check two datasets for equality
-python -m wkcuber.check_equality /data/source /data/target
-```
-
-### Parallelization
-
-Most tasks can be parallelized. Via `--distribution_strategy` you can choose between `multiprocessing` and `slurm`; the former is further configured with `--jobs`, the latter with e.g. `--job_resources='{"mem": "10M"}'`. Use `--help` for more information.
-
-## Development
-Make sure to install all required dependencies using Poetry:
-```
-pip install poetry
-poetry install
-```
-
-Please format, lint, and unit-test your code changes before merging them:
-```
-poetry run black .
-poetry run pylint -j4 wkcuber
-poetry run pytest tests
-```
-
-Please also run the extended test suite:
-```
-tests/scripts/all_tests.sh
-```
-
-PyPI releases are pushed automatically when a new Git tag/GitHub release is created.
-
-## API documentation
-Check out the [latest version of the API documentation](https://static.webknossos.org/lib-docs/master/wkcuber/api.html).
-
-### Generate the API documentation
-Run `docs/api.sh` to start a server displaying the API docs. `docs/api.sh --persist` persists the HTML to `docs/api`.
-
-## Test Data Credits
-Excerpts for testing purposes have been sampled from:
-- Dow, Jacobo, Hossain, Siletti & Hudspeth (2018). **Connectomics of the zebrafish's lateral-line neuromast reveals wiring and miswiring in a simple microcircuit.** eLife. [DOI:10.7554/eLife.33988](https://elifesciences.org/articles/33988)
-- Zheng, Lauritzen, Perlman, Robinson, Nichols, Milkie, Torrens, Price, Fisher, Sharifi, Calle-Schuler, Kmecova, Ali, Karsh, Trautman, Bogovic, Hanslovsky, Jefferis, Kazhdan, Khairy, Saalfeld, Fetter & Bock (2018). **A Complete Electron Microscopy Volume of the Brain of Adult Drosophila melanogaster.** Cell. [DOI:10.1016/j.cell.2018.06.019](https://www.cell.com/cell/fulltext/S0092-8674(18)30787-6). License: [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/)
+Python library for creating and working with [webKnossos](https://webknossos.org) [WKW](https://github.com/scalableminds/webknossos-wrap) datasets. WKW is a container format for efficiently storing large-scale 3D image data as found in (electron) microscopy.

-## License
-AGPLv3
-Copyright scalable minds
+## [webKnossos](webknossos) Python Library
+:construction: WIP