The annotation tools can be started from the napari plugin menu, the command line or from python scripts.
They are built as a napari plugin and make use of existing napari functionality wherever possible. If you are not familiar with napari yet, [start here](https://napari.org/stable/tutorials/fundamentals/quick_start.html).
The `micro_sam` tools mainly use [the point layer](https://napari.org/stable/howtos/layers/points.html), [shape layer](https://napari.org/stable/howtos/layers/shapes.html) and [label layer](https://napari.org/stable/howtos/layers/labels.html).
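To make the layer usage concrete, here is a small illustrative sketch of the data formats these napari layers hold. This is plain numpy, not `micro_sam` API, and the values are made up for illustration:

```python
import numpy as np

# A point layer stores prompts as (y, x) coordinates, one row per point.
points = np.array([
    [42.0, 17.0],    # a point prompt at row 42, column 17
    [100.0, 80.0],
])

# A shape layer stores a box prompt as the four corners of a rectangle.
box = np.array([
    [10.0, 10.0],
    [10.0, 50.0],
    [60.0, 50.0],
    [60.0, 10.0],
])

# A label layer holds the segmentation as an integer image, where each
# object gets its own id and 0 is background.
labels = np.zeros((128, 128), dtype=np.uint32)
labels[10:60, 10:50] = 1  # one object, with id 1, inside the box above

print(points.shape, box.shape, int(labels.max()))  # prints: (2, 2) (4, 2) 1
```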
The annotation tools are explained in detail below. We also provide [video tutorials](TODO).
The annotation tools can be started from the napari plugin menu.
## Annotator 2D
The 2d annotator can be started by
- clicking `Annotator 2d` in the plugin menu.
- running `$ micro_sam.annotator_2d` in the command line. Run `micro_sam.annotator_2d -h` for details.
- calling `micro_sam.sam_annotator.annotator_2d` in a python script. Check out [examples/annotator_2d.py](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/annotator_2d.py) for details.
Note that point prompts and box prompts can be combined. When you're using point prompts you can only segment one object at a time. With box prompts you can segment several objects at once.
Check out [this video](TODO) for a tutorial on the 2d annotation tool.
## Annotator 3D
The 3d annotator can be started by
- clicking `Annotator 3d` in the plugin menu.
- running `$ micro_sam.annotator_3d` in the command line. Run `micro_sam.annotator_3d -h` for details.
- calling `micro_sam.sam_annotator.annotator_3d` in a python script. Check out [examples/annotator_3d.py](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/annotator_3d.py) for details.
Note that you can only segment one object at a time with the 3d annotator.
Check out [this video](TODO) for a tutorial on the 3d annotation tool.
## Annotator Tracking
The tracking annotator can be started by
- clicking `Annotator Tracking` in the plugin menu.
- running `$ micro_sam.annotator_tracking` in the command line. Run `micro_sam.annotator_tracking -h` for details.
- calling `micro_sam.sam_annotator.annotator_tracking` in a python script. Check out [examples/annotator_tracking.py](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/annotator_tracking.py) for details.
Note that the tracking annotator only supports 2d image data; volumetric data is not supported.
Check out [this video](TODO) for a tutorial on the tracking annotation tool.
## Image Series Annotator
TODO
We also provide the `image series annotator`, which can be used to run the 2d annotator for several images in a folder. You can start it by clicking `Image series annotator` in the plugin menu, by running `micro_sam.image_series_annotator` in the command line, or from a [python script](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/image_series_annotator.py).
You may need to change this command to install the correct CUDA version for your system; see [https://pytorch.org/](https://pytorch.org/) for details.
## From source
```
$ mamba activate sam
$ pip install -e .
```
## From installer

**The installers are still experimental and not fully tested.** Mac is not supported yet, but we are working on providing an installer for it as well.
The installers will not enable you to use a GPU; if you have one, please consider installing `micro_sam` via [mamba](#from-mamba) instead. They also do not provide access to the python library.
**Linux Installer:**
To use the installer:
- After the installation you can start the annotator with the command `.../micro_sam/bin/micro_sam.annotator`.
- To make it easier to run the annotation tool you can add `.../micro_sam/bin` to your `PATH` or set a softlink to `.../micro_sam/bin/micro_sam.annotator`.
**Windows Installer:**

- Choose installation path. By default it will be installed in `C:\Users\<Username>\micro_sam` for `Just Me` installation or in `C:\ProgramData\micro_sam` for `All Users`.
- The installer will unpack all micro_sam files to the installation directory.
- After the installation you can start the annotator by double clicking on `.\micro_sam\Scripts\micro_sam.annotator.exe` or with the command `.\micro_sam\Scripts\micro_sam.annotator.exe` from the Command Prompt.
Segment Anything for Microscopy implements automatic and interactive annotation for microscopy data. It is built on top of [Segment Anything](https://segment-anything.com/) by Meta AI and specializes it for microscopy and other bio-imaging data.
4
4
Its core components are:
- The `micro_sam` tools for interactive data annotation, built as a [napari](https://napari.org/stable/) plugin.
- The `micro_sam` library to apply Segment Anything to 2d and 3d data or fine-tune it on your data.
- The `micro_sam` models that are fine-tuned on publicly available microscopy data and that are available on [BioImage.IO](https://bioimage.io/#/).
Based on these components, `micro_sam` enables fast interactive and automatic annotation for microscopy data, like interactive cell segmentation from bounding boxes:
`micro_sam` is now available as stable version 1.0 and we will not change its user interface significantly in the foreseeable future.
We are still working on improving and extending its functionality. The current roadmap includes:
- Releasing more and better finetuned models.
- Integrating parameter efficient training and compressed models for faster fine-tuning.
- Improving the 3D segmentation and tracking functionality.
If you run into any problems or have questions please [open an issue](https://github.com/computational-cell-analytics/micro-sam/issues) or reach out via [image.sc](https://forum.image.sc/) using the tag `micro-sam`.
We also provide installers for Windows and Linux. For more details on the available installation options check out [the installation section](#installation).
After installing `micro_sam` you can start napari and select the annotation tool you want to use from `Plugins->Segment Anything for Microscopy`. Check out the [quickstart tutorial video](TODO) for a short introduction and [the annotation tool section](#annotation-tools) for details.
The `micro_sam` python library can be imported via
```python
import micro_sam
```
It is explained in more detail [here](#using-the-python-library).
We provide different finetuned models for microscopy that can be used within our tools or any other tool that supports Segment Anything. See [finetuned models](#finetuned-models) for details on the available models.
You can also train models on your own data, see [here for details](#training-your-own-model).