## Usage
After installing the `micro_sam` python application, the three interactive annotation tools can be started from the command line or from a python script (see details below).

They are built with napari to implement the viewer and user interaction. If you are not familiar with napari yet, [start here](https://napari.org/stable/tutorials/fundamentals/quick_start.html).

To use the apps, the functionality of [napari point layers](https://napari.org/stable/howtos/layers/points.html) and [napari labels layers](https://napari.org/stable/howtos/layers/labels.html) is of particular importance.
### 2D Segmentation
The application for 2d segmentation can be started in two ways:
- Via the command line with the command `micro_sam.annotator_2d`. Run `micro_sam.annotator_2d -h` for details.
- From a python script with the function `micro_sam.sam_annotator.annotator_2d`. Check out [examples/sam_annotator_2d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_2d.py) for details, or see the minimal sketch below.
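For orientation, here is a minimal sketch of starting the 2d annotator from python. It assumes the image is passed to `annotator_2d` as a numpy array and uses a placeholder file path; the linked example script shows the exact arguments (e.g. for caching the image embeddings).

```python
# Minimal sketch: start the 2d annotator from a python script.
import imageio.v3 as imageio
from micro_sam.sam_annotator import annotator_2d

# Placeholder path; replace it with your own 2d image.
image = imageio.imread("/path/to/image.tif")

# Assumes the image is passed as a numpy array; see
# examples/sam_annotator_2d.py for the exact arguments.
annotator_2d(image)
```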
Below you can see the interface of the application for a cell segmentation example. The most important parts of the user interface are:
1. The napari layers that contain the image, segmentations and prompts:
    - `prompts`: point layer that is used to provide prompts to SegmentAnything. Positive prompts (green points) mark the object you want to segment, negative prompts (red points) mark regions outside of the object.
    - `current_object`: label layer that contains the object you're currently segmenting.
    - `committed_objects`: label layer with the objects that have already been segmented.
    - `auto_segmentation`: label layer with the results from using SegmentAnything for automatic instance segmentation.
    - `raw`: image layer that shows the image data.
2. The prompt menu for changing the currently selected point from positive to negative and vice versa. This can also be done by pressing `t`.
3. The menu for automatic segmentation. Pressing `Segment All Objects` will run automatic segmentation (this can take a few minutes if you are using a CPU). The results will be displayed in the `auto_segmentation` layer.
4. The menu for interactive segmentation. Pressing `Segment Object` (or `s`) will run segmentation for the current prompts. The result is displayed in `current_object`.
5. The menu for committing the segmentation. When pressing `Commit` (or `c`) the result from the selected layer (either `current_object` or `auto_segmentation`) will be transferred from the respective layer to `committed_objects`.
Check out [this video](https://youtu.be/DfWE_XRcqN8) for an overview of the interactive 2d segmentation functionality.
### 3D Segmentation
The application for 3d segmentation can be started as follows:
- Via the command line with the command `micro_sam.annotator_3d`. Run `micro_sam.annotator_3d -h` for details.
- From a python script with the function `micro_sam.sam_annotator.annotator_3d`. Check out [examples/sam_annotator_3d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_3d.py) for details, or see the minimal sketch below.
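A rough sketch of calling the 3d annotator from python is shown below. It assumes the volume is passed to `annotator_3d` as a numpy array with axes (z, y, x) and uses a placeholder file path; consult the example script for the exact arguments.

```python
# Minimal sketch: start the 3d annotator from a python script.
import imageio.v3 as imageio
from micro_sam.sam_annotator import annotator_3d

# Placeholder path; replace it with your own 3d volume.
volume = imageio.imread("/path/to/volume.tif")

# Assumes the volume is passed as a numpy array (z, y, x); see
# examples/sam_annotator_3d.py for the exact arguments.
annotator_3d(volume)
```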
The most important parts of the user interface are listed below. Most of these elements are the same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation).
1. The napari layers that contain the image, segmentation and prompts. Same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation) but without the `auto_segmentation` layer.
2. The prompt menu.
3. The menu for interactive segmentation.
4. The 3d segmentation menu. Pressing `Segment Volume` (or `v`) will extend the segmentation for the current object across the volume.
5. The menu for committing the segmentation.
Check out [this video](https://youtu.be/5Jo_CtIefTM) for an overview of the interactive 3d segmentation functionality.
### Tracking
The application for interactive tracking (of 2d data) can be started as follows:
- Via the command line with the command `micro_sam.annotator_tracking`. Run `micro_sam.annotator_tracking -h` for details.
- From a python script with the function `micro_sam.sam_annotator.annotator_tracking`. Check out [examples/sam_annotator_tracking](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_tracking.py) for details, or see the minimal sketch below.
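The sketch below illustrates one way the tracking annotator might be called from python. It assumes the timeseries is passed to `annotator_tracking` as a numpy array with axes (t, y, x); the file path is a placeholder and the example script shows the exact arguments.

```python
# Minimal sketch: start the tracking annotator from a python script.
import imageio.v3 as imageio
from micro_sam.sam_annotator import annotator_tracking

# Placeholder path; replace it with your own timeseries of 2d images.
timeseries = imageio.imread("/path/to/timeseries.tif")

# Assumes the data is passed as a numpy array (t, y, x); see
# examples/sam_annotator_tracking.py for the exact arguments.
annotator_tracking(timeseries)
```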
The most important parts of the user interface are listed below. Most of these elements are the same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation).
1. The napari layers that contain the image, segmentation and prompts. Same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation), but without the `auto_segmentation` layer; `current_tracks` and `committed_tracks` are the equivalent of `current_object` and `committed_objects`.
2. The prompt menu.
3. The menu with tracking settings: `track_state` is used to indicate that the object you are tracking is dividing in the current frame. `track_id` is used to select which of the tracks after division you are following.
4. The menu for interactive segmentation.
5. The tracking menu. Press `Track Object` (or `v`) to track the current object across time.
6. The menu for committing the current tracking result.