
Commit 4530851

Update README
1 parent 36079a4 commit 4530851


README.md

Lines changed: 42 additions & 26 deletions
@@ -62,50 +62,66 @@ pip install -e .

## Usage

-After the installation the three applications for interactive annotations can be started from the command line or within a python script:
-- **2d segmentation**: via the command `micro_sam.annotator_2d` or with the function `micro_sam.sam_annotator.annotator_2d` from python. Run `micro_sam.annotator_2d -h` or check out [examples/sam_annotator_2d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_2d.py) for details.
-- **3d segmentation**: via the command `micro_sam.annotator_3d` or with the function `micro_sam.sam_annotator.annotator_3d` from python. Run `micro_sam.annotator_3d -h` or check out [examples/sam_annotator_3d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_3d.py) for details.
-- **tracking**: via the command `micro_sam.annotator_tracking` or with the function `micro_sam.sam_annotator.annotator_tracking` from python. Run `micro_sam.annotator_tracking -h` or check out [examples/sam_annotator_tracking](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_tracking.py) for details.
-All three applications are built with napari. If you are not familiar with napari yet, start [here](https://napari.org/stable/tutorials/fundamentals/quick_start.html).
+After installing the `micro_sam` python package, the three interactive annotation tools can be started from the command line or from a python script (see details below).
+They are built with napari to implement the viewer and user interaction. If you are not familiar with napari yet, [start here](https://napari.org/stable/tutorials/fundamentals/quick_start.html).
+To use the apps, the functionality of [napari point layers](https://napari.org/stable/howtos/layers/points.html) and [napari labels layers](https://napari.org/stable/howtos/layers/labels.html) is of particular importance.

### 2D Segmentation

+The application for 2d segmentation can be started in two ways:
+- Via the command line with the command `micro_sam.annotator_2d`. Run `micro_sam.annotator_2d -h` for details.
+- From a python script with the function `micro_sam.sam_annotator.annotator_2d`. Check out [examples/sam_annotator_2d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_2d.py) for details; a minimal call is also sketched below.
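
For illustration, a minimal sketch of the python entry point described above. It assumes the image is stored as a tif file (placeholder name `cells.tif`) and that `annotator_2d` accepts the image as a numpy array; the exact arguments may differ, so refer to the linked example script.

```python
import imageio
from micro_sam.sam_annotator import annotator_2d

# Load the 2d image to annotate (placeholder path).
image = imageio.imread("cells.tif")

# Start the interactive 2d annotation tool in napari.
# The exact arguments may differ between versions, see the linked example script.
annotator_2d(image)
```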
+Below you can see the interface of the application for a cell segmentation example:

<img src="https://github.com/computational-cell-analytics/micro-sam/assets/4263537/90055f2f-f6f3-4224-ab3c-57e545c278bc" width="768">

-GUI-Elements 3d annotation:
-1. TODO
-2. TODO
-3. TODO
-4. TODO
-5. TODO
+The most important parts of the user interface are:
+1. The napari layers that contain the image, segmentations and prompts:
+    - `prompts`: point layer that is used to provide prompts to SegmentAnything. Positive prompts (green points) mark the object you want to segment, negative prompts (red points) mark the outside of the object.
+    - `current_object`: label layer that contains the object you are currently segmenting.
+    - `committed_objects`: label layer with the objects that have already been segmented.
+    - `auto_segmentation`: label layer with the results from using SegmentAnything for automatic instance segmentation.
+    - `raw`: image layer that shows the image data.
+2. The prompt menu for changing the currently selected point from positive to negative and vice versa. This can also be done by pressing `t`.
+3. The menu for automatic segmentation. Pressing `Segment All Objects` will run automatic segmentation (this can take a few minutes if you are using a CPU). The results will be displayed in the `auto_segmentation` layer.
+4. The menu for interactive segmentation. Pressing `Segment Object` (or `s`) will run segmentation for the current prompts. The result is displayed in `current_object`.
+5. The menu for committing the segmentation. When pressing `Commit` (or `c`) the result from the selected layer (either `current_object` or `auto_segmentation`) will be transferred from the respective layer to `committed_objects`.

-TODO link to tutorial video.
+Check out [this video](https://youtu.be/DfWE_XRcqN8) for an overview of the interactive 2d segmentation functionality.

### 3D Segmentation

+The application for 3d segmentation can be started as follows:
+- Via the command line with the command `micro_sam.annotator_3d`. Run `micro_sam.annotator_3d -h` for details.
+- From a python script with the function `micro_sam.sam_annotator.annotator_3d`. Check out [examples/sam_annotator_3d](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_3d.py) for details; a minimal call is also sketched below.
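
A similar minimal sketch for the 3d annotator, assuming the volume is stored as a tif stack (placeholder name `volume.tif`) and that `annotator_3d` accepts it as a numpy array; see the linked example for the actual usage.

```python
import imageio
from micro_sam.sam_annotator import annotator_3d

# Load the 3d volume to annotate (placeholder path, axis order z, y, x).
volume = imageio.volread("volume.tif")

# Start the interactive 3d annotation tool; exact arguments may differ between versions.
annotator_3d(volume)
```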
<img src="https://github.com/computational-cell-analytics/micro-sam/assets/4263537/3c35ba63-1b67-48df-9b11-94919bdc7c79" width="768">

-GUI-Elements 3d annotation:
-1. TODO
-2. TODO
-3. TODO
-4. TODO
-5. TODO
+The most important parts of the user interface are listed below. Most of these elements are the same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation).
+1. The napari layers that contain the image, segmentation and prompts. Same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation) but without the `auto_segmentation` layer.
+2. The prompt menu.
+3. The menu for interactive segmentation.
+4. The 3d segmentation menu. Pressing `Segment Volume` (or `v`) will extend the segmentation for the current object across the volume.
+5. The menu for committing the segmentation.

Check out [this video](https://youtu.be/5Jo_CtIefTM) for an overview of the interactive 3d segmentation functionality.

### Tracking

+The application for interactive tracking (of 2d data) can be started as follows:
+- Via the command line with the command `micro_sam.annotator_tracking`. Run `micro_sam.annotator_tracking -h` for details.
+- From a python script with the function `micro_sam.sam_annotator.annotator_tracking`. Check out [examples/sam_annotator_tracking](https://github.com/computational-cell-analytics/micro-sam/blob/master/examples/sam_annotator_tracking.py) for details; a minimal call is also sketched below.
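
And a minimal sketch for the tracking annotator, assuming the timeseries is stored as a tif stack (placeholder name `timeseries.tif`) with time as the first axis and that `annotator_tracking` accepts it as a numpy array; see the linked example for the actual usage.

```python
import imageio
from micro_sam.sam_annotator import annotator_tracking

# Load the 2d timeseries to track (placeholder path, axis order t, y, x).
timeseries = imageio.volread("timeseries.tif")

# Start the interactive tracking tool; exact arguments may differ between versions.
annotator_tracking(timeseries)
```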
<img src="https://github.com/computational-cell-analytics/micro-sam/assets/4263537/1fdffe3c-ff10-4d06-a1ba-9974a673b846" width="768">

-GUI-Elements tracking:
-1. TODO
-2. TODO
-3. TODO
-4. TODO
-5. TODO
-6. TODO
+The most important parts of the user interface are listed below. Most of these elements are the same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation).
+1. The napari layers that contain the image, segmentation and prompts. Same as for [the 2d segmentation app](https://github.com/computational-cell-analytics/micro-sam#2d-segmentation), but without the `auto_segmentation` layer; `current_tracks` and `committed_tracks` are the equivalents of `current_object` and `committed_objects`.
+2. The prompt menu.
+3. The menu with tracking settings: `track_state` is used to indicate that the object you are tracking is dividing in the current frame. `track_id` is used to select which of the tracks after division you are following.
+4. The menu for interactive segmentation.
+5. The tracking menu. Press `Track Object` (or `v`) to track the current object across time.
+6. The menu for committing the current tracking result.

TODO link to video tutorial
