
Commit 58167b9

Merge pull request #971 from computational-cell-analytics/dev
New release
2 parents acd4886 + 7e82cec commit 58167b9


54 files changed: +3263 −648 lines.

.github/workflows/test.yaml

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ jobs:
     - name: Setup micromamba
       uses: mamba-org/setup-micromamba@v1
       with:
-        environment-file: ${{ runner.os == 'Windows' && 'environment_cpu_win.yaml' || 'environment.yaml' }}
+        environment-file: 'environment.yaml'
         create-args: >-
           python=${{ matrix.python-version }}

README.md

Lines changed: 6 additions & 6 deletions
@@ -19,7 +19,7 @@ We implement napari applications for:
 <img src="https://github.com/computational-cell-analytics/micro-sam/assets/4263537/dfca3d9b-dba5-440b-b0f9-72a0683ac410" width="256">
 <img src="https://github.com/computational-cell-analytics/micro-sam/assets/4263537/aefbf99f-e73a-4125-bb49-2e6592367a64" width="256">
 
-If you run into any problems or have questions regarding our tool please open an issue on Github or reach out via [image.sc](https://forum.image.sc/) using the tag `micro-sam` and tagging @constantinpape.
+If you run into any problems or have questions regarding our tool, please open an [issue](https://github.com/computational-cell-analytics/micro-sam/issues/new/choose) on GitHub or reach out via [image.sc](https://forum.image.sc/), using the tag `micro-sam` and tagging [@constantinpape](https://forum.image.sc/u/constantinpape/summary) and [@anwai98](https://forum.image.sc/u/anwai98/summary).
 
 
 ## Installation and Usage
@@ -31,7 +31,7 @@ Please check [the documentation](https://computational-cell-analytics.github.io/
 
 We welcome new contributions!
 
-If you are interested in contributing to micro-sam, please see the [contributing guide](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#contribution-guide). The first step is to [discuss your idea in a new issue](https://github.com/computational-cell-analytics/micro-sam/issues/new) with the current developers.
+If you are interested in contributing to `micro-sam`, please see the [contributing guide](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#contribution-guide). The first step is to [discuss your idea in a new issue](https://github.com/computational-cell-analytics/micro-sam/issues/new) with the current developers.
 
 
 ## Citation
@@ -50,12 +50,12 @@ There are a few other napari plugins built around Segment Anything:
 - https://github.com/hiroalchem/napari-SAM4IS
 
 Compared to these we support more applications (2d, 3d and tracking), and provide finetuning methods and finetuned models for microscopy data.
-[WebKnossos](https://webknossos.org/) also offers integration of SegmentAnything for interactive segmentation.
+[WebKnossos](https://webknossos.org/) and [QuPath](https://qupath.github.io/) also offer integration of Segment Anything for interactive segmentation.
 
 We have also built follow-up work that is based on `micro_sam`:
-- https://github.com/computational-cell-analytics/patho-sam - improves SAM for histopathology
-- https://github.com/computational-cell-analytics/medico-sam - improves it for medical imaging
-- https://github.com/computational-cell-analytics/peft-sam - studies parameter efficient fine-tuning for SAM
+- https://github.com/computational-cell-analytics/patho-sam - improves SAM for histopathology.
+- https://github.com/computational-cell-analytics/medico-sam - improves SAM for medical imaging.
+- https://github.com/computational-cell-analytics/peft-sam - studies parameter-efficient fine-tuning for SAM.
 
 ## Release Overview

Lines changed: 101 additions & 0 deletions
@@ -0,0 +1,101 @@
import numpy as np
from shapely import LineString
from skimage.measure import find_contours

import ezomero as ez

from micro_sam.sam_annotator import annotator_2d


def _load_image(conn, upload_segmentation_to_omero):
    # Step 1: Use the connection to access files.
    # NOTE: The "project_id" info is located inside the metadata section of your project,
    # e.g. the "project_id" for "example_data_test" is set to 46.
    dataset_ids = ez.get_dataset_ids(conn, project=46)

    # - Now that we know the dataset ids, the next sub-groups where data is stored,
    #   we go ahead with accessing them.
    for id in dataset_ids:
        image_ids = ez.get_image_ids(conn, dataset=id)
        print(image_ids)

    # - Once we have identified the image ids, let's open just one of them.
    #   I will open one z-stack.
    # NOTE: Our array is located at pixels.
    image_id = 7540
    image_obj, pixels = ez.get_image(
        conn,
        image_id=image_id,
        no_pixels=False,  # if True, only fetches the metadata, which is super fast.
        axis_lengths=(512, 512, 40, 1, 1),  # fetches an ROI in XYZCT config, otherwise the full volume.
    )

    # Let's annotate stuff using micro-sam.
    pixels = pixels.squeeze()  # Remove singletons on-the-fly.

    # HACK: For segmentation, let's keep it simple and segment the last slice only.
    pixels = pixels[-1, ...]

    # Run the 2d annotator.
    viewer = annotator_2d(image=pixels, embedding_path="test_omero.zarr", return_viewer=True)

    import napari
    napari.run()

    if upload_segmentation_to_omero:

        # Store the segmentations locally for storing them either as polygons or something else.
        segmentation = viewer.layers["committed_objects"].data

        # Let's try converting them to polygons, store them as a list of polygons and put it back.
        contours = find_contours(segmentation)[0]  # Get contours.
        contour_as_line = LineString(contours)  # Convert contours to line structure.
        simple_line = contour_as_line.simplify(tolerance=1.0)  # Adjust tolerance to make the polygon.
        simple_coords = np.array(simple_line.coords)

        # Now, let's post a single ROI and see if it worked.
        ez.post_roi(conn, image_id=image_id, shapes=[ez.rois.Polygon(simple_coords, z=40)], name="test_micro_sam_seg")


if __name__ == "__main__":
    import argparse
    parser = argparse.ArgumentParser(description="Run `micro-sam` on OMERO data")
    parser.add_argument(
        "--host", default="omero-training.gerbi-gmb.de", type=str, help="The host server URL.",
    )

    # NOTE: If the default port below is blocked, use the websocket endpoint instead:
    # host = "wss://omero-training.gerbi-gmb.de/omero-wss"
    # port = 443  # the default port (4064) seems to work for me.
    parser.add_argument(
        "--port", default=4064, type=int, help="The choice of port for connecting to the server.",
    )
    parser.add_argument(
        "--username", default="tim2025_test", type=str, help="The username.",
    )
    parser.add_argument(
        "--password", default="tim2025_test", type=str, help="The corresponding user's password."
    )
    parser.add_argument(
        "--omero_group", default="TiM2025_preparation", type=str, help="The OMERO-level group name."
    )
    parser.add_argument(
        "--upload_segmentation", action="store_true", help="Whether to upload the segmentation as polygons to OMERO."
    )
    args = parser.parse_args()

    # Inspired by https://github.com/I3D-bio/omero_python_workshop.

    # Check connection to the test account.
    conn = ez.connect(
        user=args.username,
        password=args.password,
        group=args.omero_group,
        host=args.host,
        port=args.port,
        secure=True,
    )
    print(f"Is connected: {conn.isConnected()}")

    # Visualize the image from the OMERO server.
    _load_image(conn, upload_segmentation_to_omero=args.upload_segmentation)
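
The script above posts a single polygon, derived from only the first contour that `find_contours` returns on the full label image. Below is a minimal sketch of how one might instead post one polygon per committed object; it reuses `conn`, `image_id` and `segmentation` from the script, and assumes (as the `shapes` argument above suggests) that `ez.post_roi` accepts a list with several shapes. The `z` plane and simplification tolerance are illustrative values only.

```python
import numpy as np
from shapely import LineString
from skimage.measure import find_contours
import ezomero as ez


def post_all_objects(conn, image_id, segmentation, z=40, tolerance=1.0):
    """Convert every labeled object in `segmentation` to a polygon ROI and post it to OMERO."""
    shapes = []
    for label_id in np.unique(segmentation):
        if label_id == 0:  # skip the background label
            continue
        mask = (segmentation == label_id).astype(float)
        contours = find_contours(mask, level=0.5)  # contours of this object's binary mask
        if not contours:
            continue
        contour = max(contours, key=len)  # use the longest contour of the object
        simple = LineString(contour).simplify(tolerance=tolerance)  # coarsen to a simple polygon
        shapes.append(ez.rois.Polygon(np.array(simple.coords), z=z))
    if shapes:
        ez.post_roi(conn, image_id=image_id, shapes=shapes, name="micro_sam_objects")
```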

doc/faq.md

Lines changed: 3 additions & 0 deletions
@@ -166,6 +166,9 @@ NOTE: It is important to choose the correct model type when you opt for the abov
 The older model versions are still available on zenodo. You can find the download links for all of them [here](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#other-models).
 You can then use those models with the custom checkpoint option, see answer 15 for details.
 
+### 18. I would like to evaluate the instance segmentation quantitatively. Can you suggest how to do that?
+`micro-sam` provides a `micro_sam.evaluate` CLI, which computes the mean segmentation accuracy (introduced in the Pascal VOC challenge) of a predicted instance segmentation against the corresponding ground-truth annotations. Please see our paper (`Methods` -> `Inference and Evaluation`) and `$ micro_sam.evaluate -h` for more details on the evaluation CLI.
+
 
 ## Fine-tuning questions
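
To illustrate what this metric measures, here is a small, self-contained sketch of the mean segmentation accuracy for two label images (ground truth and prediction). It is an independent reimplementation for illustration only, not micro-sam's code; in practice the `micro_sam.evaluate` CLI computes it for you.

```python
import numpy as np


def _accuracy_at_threshold(gt, pred, iou_threshold):
    """Segmentation accuracy TP / (TP + FP + FN) at a single IoU threshold."""
    gt_ids = np.unique(gt)
    gt_ids = gt_ids[gt_ids != 0]        # 0 is treated as background
    pred_ids = np.unique(pred)
    pred_ids = pred_ids[pred_ids != 0]
    if len(gt_ids) == 0 and len(pred_ids) == 0:
        return 1.0
    # Brute-force pairwise IoU between ground-truth and predicted objects.
    ious = np.zeros((len(gt_ids), len(pred_ids)))
    for i, gid in enumerate(gt_ids):
        gt_mask = gt == gid
        for j, pid in enumerate(pred_ids):
            pred_mask = pred == pid
            union = np.logical_or(gt_mask, pred_mask).sum()
            ious[i, j] = np.logical_and(gt_mask, pred_mask).sum() / union if union else 0.0
    # At IoU thresholds >= 0.5 each ground-truth object matches at most one prediction.
    tp = int((ious.max(axis=1) >= iou_threshold).sum()) if ious.size else 0
    fp = len(pred_ids) - tp
    fn = len(gt_ids) - tp
    return tp / (tp + fp + fn)


def mean_segmentation_accuracy(gt, pred):
    """Average the accuracy over the IoU thresholds 0.5, 0.55, ..., 0.95."""
    thresholds = np.arange(0.5, 1.0, 0.05)
    return float(np.mean([_accuracy_at_threshold(gt, pred, t) for t in thresholds]))
```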

doc/installation.md

Lines changed: 3 additions & 19 deletions
@@ -13,12 +13,10 @@ We do **not support** installing `micro_sam` with pip.
 
 `conda` is a python package manager. If you don't have it installed yet you can follow the instructions [here](https://conda-forge.org/download/) to set it up on your system.
 Please make sure that you are using an up-to-date version of conda to install `micro_sam`.
-You can also use [mamba](https://mamba.readthedocs.io/en/latest/), which is a drop-in replacement for conda, to install it. In this case, just replace the `conda` command below with `mamba`.
+You can also use [mamba](https://mamba.readthedocs.io/en/latest/), which is a drop-in replacement for conda, to install it. In this case, just replace the `conda` commands below with `mamba`.
 
 **IMPORTANT**: Do not install `micro_sam` in the base conda environment.
 
-**Installation on Linux and Mac OS:**
-
 `micro_sam` can be installed in an existing environment via:
 ```bash
 conda install -c conda-forge micro_sam
@@ -39,24 +37,10 @@ However, if you have an older operating system, or a CUDA version older than 12,
 conda install -c conda-forge micro_sam "libtorch=*=cuda11*"
 ```
 
-**Installation on Windows:**
-
-`pytorch` is currently not available on conda-forge for windows. Thus, you have to install it from the `pytorch` conda channel. In addition, you have to specify two specific dependencies to avoid incompatibilities.
-This can be done with the following commands:
-```bash
-conda install -c pytorch -c conda-forge micro_sam "nifty=1.2.1=*_4" "protobuf<5"
-```
-to install `micro_sam` in an existing environment and
-```bash
-conda create -c pytorch -c conda-forge -n micro-sam micro_sam "nifty=1.2.1=*_4" "protobuf<5"
-```
-
 ## From source
 
 To install `micro_sam` from source, we recommend to first set up an environment with the necessary requirements:
-- [environment.yaml](https://github.com/computational-cell-analytics/micro-sam/blob/master/environment.yaml): to set up an environment on Linux or Mac OS.
-- [environment_cpu_win.yaml](https://github.com/computational-cell-analytics/micro-sam/blob/master/environment_cpu_win.yaml): to set up an environment on windows with CPU support.
-- [environment_gpu_win.yaml](https://github.com/computational-cell-analytics/micro-sam/blob/master/environment_gpu_win.yaml): to set up an environment on windows with GPU support.
+- [environment.yaml](https://github.com/computational-cell-analytics/micro-sam/blob/master/environment.yaml): to set up an environment on Windows, Linux or Mac OS.
 
 To create one of these environments and install `micro_sam` into it follow these steps
 
@@ -75,7 +59,7 @@ cd micro-sam
 3. Create the respective environment:
 
 ```bash
-conda env create -f <ENV_FILE>.yaml
+conda env create -f environment.yaml
 ```
 
 4. Activate the environment:

doc/python_library.md

Lines changed: 2 additions & 2 deletions
@@ -32,7 +32,7 @@ In fact the best results can be expected when finetuning on your own data, and w
 So a good strategy is to annotate a few images with one of the provided models using our interactive annotation tools and, if the model is not working as well as required for your use-case, finetune on the annotated data.
 We recommend checking out our [paper](https://www.nature.com/articles/s41592-024-02580-4) for details on how much data is required for finetuning Segment Anything.
 
-The training logic is implemented in `micro_sam.training` and is based on [torch-em](https://github.com/constantinpape/torch-em). Check out [the finetuning notebook](https://github.com/computational-cell-analytics/micro-sam/blob/master/notebooks/sam_finetuning.ipynb) to see how to use it.
+The training logic is implemented in `micro_sam.training` and is based on [torch-em](https://github.com/constantinpape/torch-em). Check out [the finetuning notebook](https://github.com/computational-cell-analytics/micro-sam/blob/master/notebooks/sam_finetuning.ipynb) to see how to use it, or the training CLI (`micro_sam.train`); see `micro_sam.train -h` for details.
 We also support training an additional decoder for automatic instance segmentation. This yields better results than the automatic mask generation of Segment Anything and is significantly faster.
 The notebook explains how to train it together with the rest of SAM and how to then use it.
 
@@ -52,4 +52,4 @@ Here is a list of resources, together with their recommended training settings,
 | GPU (NVIDIA A100) | 80GB | ViT Large | 2 | *all* | 30 |
 | GPU (NVIDIA A100) | 80GB | ViT Huge | 2 | *all* | 25 |
 
-> NOTE: If you use the [finetuning UI](#finetuning-ui) or `micro_sam.training.training.train_sam_for_configuration` you can specify the hardware configuration and the best settings for it will be set automatically. If your hardware is not in the settings we have tested choose the closest match. You can set the training parameters yourself when using `micro_sam.training.training.train_sam`. Be aware that the choice for the number of objects per image, the batch size, and the type of model have a strong impact on the VRAM needed for training and the duration of training. See the [finetuning notebook](https://github.com/computational-cell-analytics/micro-sam/blob/master/notebooks/sam_finetuning.ipynb) for an overview of these parameters.
+> NOTE: If you use the [finetuning UI](#finetuning-ui), the training CLI (`micro_sam.train`), or `micro_sam.training.training.train_sam_for_configuration`, you can specify the hardware configuration and the best settings for it will be set automatically. If your hardware is not among the settings we have tested, choose the closest match. You can set the training parameters yourself when using `micro_sam.training.training.train_sam`. Be aware that the choice of the number of objects per image, the batch size, and the type of model has a strong impact on the VRAM needed for training and the duration of training. See the [finetuning notebook](https://github.com/computational-cell-analytics/micro-sam/blob/master/notebooks/sam_finetuning.ipynb) for an overview of these parameters.
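
As a rough illustration of the options mentioned in the note above, the sketch below calls `micro_sam.training.train_sam` directly. The data-loader setup is elided, and the keyword arguments shown (model type, epochs, objects per batch, segmentation decoder) are assumptions based on the parameters discussed in this section, so check the finetuning notebook or `micro_sam.train -h` for the exact names.

```python
# Hedged finetuning sketch; keyword names are assumptions, see the finetuning notebook.
import micro_sam.training as sam_training

# torch_em data loaders yielding image / label pairs from your annotated data.
# Building them is covered in the finetuning notebook and omitted here.
train_loader = ...
val_loader = ...

sam_training.train_sam(
    name="sam_finetuned",            # name under which the checkpoint is saved
    model_type="vit_b",              # which SAM backbone to start from
    train_loader=train_loader,
    val_loader=val_loader,
    n_epochs=25,                     # cf. the resource table above
    n_objects_per_batch=25,          # number of objects sampled per image
    with_segmentation_decoder=True,  # also train the extra decoder for automatic instance segmentation
)
```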

doc/start_page.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ After installing `micro_sam`, you can start napari from within your environment
 ```bash
 $ napari
 ```
-After starting napari, you can select the annotation tool you want to use from `Plugins -> SegmentAnything for Microscopy`. Check out the [quickstart tutorial video](https://youtu.be/gcv0fa84mCc) for a short introduction, the video of our [virtual I2K tutorial](https://www.youtube.com/watch?v=dxjU4W7bCis&list=PLdA9Vgd1gxTbvxmtk9CASftUOl_XItjDN&index=33) for an in-depth explanation and [the annotation tool section](#annotation-tools) for details.
+After starting napari, you can select the annotation tool you want to use from `Plugins -> Segment Anything for Microscopy`. Check out the [quickstart tutorial video](https://youtu.be/gcv0fa84mCc) for a short introduction, the video of our [virtual I2K tutorial](https://www.youtube.com/watch?v=dxjU4W7bCis&list=PLdA9Vgd1gxTbvxmtk9CASftUOl_XItjDN&index=33) for an in-depth explanation and [the annotation tool section](#annotation-tools) for details.
 
 The `micro_sam` python library can be imported via
 
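
Importing the library and sanity-checking the installation can look like the short sketch below; the `__version__` attribute is assumed here, and `annotator_2d` is the entry point also used in the OMERO example above.

```python
import micro_sam
from micro_sam.sam_annotator import annotator_2d  # 2d annotation tool, as in the OMERO example above

print(micro_sam.__version__)  # assumes the package exposes a __version__ attribute
```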

environment.yaml

Lines changed: 6 additions & 5 deletions
@@ -2,16 +2,16 @@ name: sam
 channels:
 - conda-forge
 dependencies:
-- nifty >=1.2.1
+- nifty >=1.2.3
 - imagecodecs
 - magicgui
-- napari >=0.5.0
+- napari >=0.5.0,<0.6.0
 - natsort
 - pip
 - pooch
 - pyqt
 - python-xxhash
-- python-elf >=0.4.8
+- python-elf >=0.6.1
 # Note: installing the pytorch package from conda-forge will generally
 # give you the most optimized version for your system, if you have a modern
 # enough OS and CUDA version (CUDA >= 12). For older versions, you can
@@ -20,11 +20,12 @@ dependencies:
 # - libtorch=*=cuda11*
 # or, to enforce a CPU installation, change to
 # - "pytorch=*=cpu*"
-- pytorch >=2.4
+- pytorch >=2.5
 - segment-anything
 - torchvision
-- torch_em >=0.7.0
+- torch_em >=0.7.8
 - tqdm
 - timm
+- xarray <2025.3.0
 - pip:
   - git+https://github.com/ChaoningZhang/MobileSAM.git

environment_cpu_win.yaml

Lines changed: 0 additions & 26 deletions
This file was deleted.

environment_gpu_win.yaml

Lines changed: 0 additions & 27 deletions
This file was deleted.
