Commit 0238f24

Added support for batch segmentation (#116)

1 parent ffdb781 commit 0238f24

File tree

6 files changed: +510 -8 lines changed

README.md

Lines changed: 2 additions & 0 deletions
@@ -40,7 +40,9 @@ The **segment-geospatial** package draws its inspiration from [segment-anything-
 - [Automatically generating object masks](https://samgeo.gishub.org/examples/automatic_mask_generator)
 - [Segmenting satellite imagery with input prompts](https://samgeo.gishub.org/examples/input_prompts)
 - [Segmenting imagery with text prompts](https://samgeo.gishub.org/examples/text_prompts)
+- [Batch segmentation with text prompts](https://samgeo.gishub.org/examples/text_prompts_batch)
 - [Using segment-geospatial with ArcGIS Pro](https://samgeo.gishub.org/examples/arcgis)
+- [Segmenting swimming pools with text prompts](https://samgeo.gishub.org/examples/swimming_pools)
 
 ## Demos
 

docs/examples/text_prompts_batch.ipynb

Lines changed: 261 additions & 0 deletions
@@ -0,0 +1,261 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Batch segmentation with text prompts\n",
    "\n",
    "[![image](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/opengeos/segment-geospatial/blob/main/docs/examples/text_prompts_batch.ipynb)\n",
    "[![image](https://img.shields.io/badge/Open-Planetary%20Computer-black?style=flat&logo=microsoft)](https://pccompute.westeurope.cloudapp.azure.com/compute/hub/user-redirect/git-pull?repo=https://github.com/opengeos/segment-geospatial&urlpath=lab/tree/segment-geospatial/docs/examples/text_prompts_batch.ipynb&branch=main)\n",
    "[![image](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/opengeos/segment-geospatial/blob/main/docs/examples/text_prompts_batch.ipynb)\n",
    "\n",
    "This notebook shows how to generate object masks from text prompts with the Segment Anything Model (SAM).\n",
    "\n",
    "Make sure you use a GPU runtime for this notebook. For Google Colab, go to `Runtime` -> `Change runtime type` and select `GPU` as the hardware accelerator."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Install dependencies\n",
    "\n",
    "Uncomment and run the following cell to install the required dependencies."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# %pip install segment-geospatial groundingdino-py leafmap localtileserver"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import leafmap\n",
    "from samgeo import tms_to_geotiff, split_raster\n",
    "from samgeo.text_sam import LangSAM"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Create an interactive map"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "m = leafmap.Map(center=[-22.1278, -51.4430], zoom=17, height=\"800px\")\n",
    "m.add_basemap(\"SATELLITE\")\n",
    "m"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Download a sample image\n",
    "\n",
    "Pan and zoom the map to select the area of interest. Use the draw tools to draw a polygon or rectangle on the map."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "bbox = m.user_roi_bounds()\n",
    "if bbox is None:\n",
    "    bbox = [-51.4494, -22.1307, -51.4371, -22.1244]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "image = \"Image.tif\"\n",
    "tms_to_geotiff(output=image, bbox=bbox, zoom=19, source=\"Satellite\", overwrite=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can also use your own image. Uncomment and run the following cell to use your own image."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# image = '/path/to/your/own/image.tif'"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Display the downloaded image on the map."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "m.layers[-1].visible = False\n",
    "m.add_raster(image, layer_name=\"Image\")\n",
    "m"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Split the image into tiles"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "split_raster(image, out_dir=\"tiles\", tile_size=(1000, 1000), overlap=0)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Initialize LangSAM class\n",
    "\n",
    "Initializing the LangSAM class might take a few minutes: it downloads the model weights and sets up the model for inference."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "sam = LangSAM()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Specify text prompts"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "text_prompt = \"tree\""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Segment images\n",
    "\n",
    "Part of the model prediction includes setting appropriate thresholds for object detection and for associating the text prompt with the detected objects. These threshold values range from 0 to 1 and are set when calling the `predict_batch` method of the LangSAM class.\n",
    "\n",
    "`box_threshold`: This value is used for object detection in the image. A higher value makes the model more selective, identifying only the most confident object instances and leading to fewer detections overall. A lower value makes the model more tolerant, leading to more detections, including potentially less confident ones.\n",
    "\n",
    "`text_threshold`: This value is used to associate the detected objects with the provided text prompt. A higher value requires a stronger association between the object and the text prompt, leading to more precise but potentially fewer associations. A lower value allows looser associations, which could increase the number of matches but also introduce less precise ones.\n",
    "\n",
    "Remember to test different threshold values on your specific data. The optimal thresholds can vary depending on the quality and nature of your images, as well as the specificity of your text prompts. Choose a balance that suits your requirements, whether that is precision or recall."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "sam.predict_batch(\n",
    "    images='tiles',\n",
    "    out_dir='masks',\n",
    "    text_prompt=text_prompt,\n",
    "    box_threshold=0.24,\n",
    "    text_threshold=0.24,\n",
    "    mask_multiplier=255,\n",
    "    dtype='uint8',\n",
    "    merge=True,\n",
    "    verbose=True,\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Visualize the results"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "m.add_raster('masks/merged.tif', cmap='viridis', nodata=0, layer_name='Mask')\n",
    "m.add_layer_manager()\n",
    "m"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](https://i.imgur.com/JUhNkm6.png)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "sam",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.16"
  },
  "orig_nbformat": 4
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
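The notebook's `split_raster(image, out_dir="tiles", tile_size=(1000, 1000), overlap=0)` step chops the downloaded GeoTIFF into tiles small enough to segment one at a time. As a rough, hypothetical sketch of the windowing arithmetic such a helper performs (not samgeo's actual implementation), the tile windows can be enumerated like this:

```python
def tile_windows(width, height, tile_size=(1000, 1000), overlap=0):
    """Yield (col_off, row_off, tile_w, tile_h) windows covering an image.

    Hypothetical sketch of the tiling arithmetic behind a split_raster-style
    helper; edge tiles are clipped to the image bounds.
    """
    tile_w, tile_h = tile_size
    step_x = tile_w - overlap
    step_y = tile_h - overlap
    for row_off in range(0, height, step_y):
        for col_off in range(0, width, step_x):
            w = min(tile_w, width - col_off)
            h = min(tile_h, height - row_off)
            yield (col_off, row_off, w, h)

# A 2500 x 1800 image with 1000 x 1000 tiles and no overlap -> a 3 x 2 grid.
windows = list(tile_windows(2500, 1800))
print(len(windows))  # 6
```

A nonzero `overlap` shrinks the step size so neighboring tiles share pixels, which can reduce seam artifacts when the per-tile masks are merged back together.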

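The `box_threshold` and `text_threshold` discussion in the notebook boils down to two independent score cut-offs applied to candidate detections. A minimal sketch with made-up scores (the tuples and helper below are illustrative, not GroundingDINO's internals):

```python
# Hypothetical candidate detections: (box confidence, text-association score).
candidates = [(0.91, 0.80), (0.55, 0.70), (0.30, 0.95), (0.88, 0.15)]

def filter_detections(candidates, box_threshold=0.24, text_threshold=0.24):
    """Keep detections whose box and text scores both clear their thresholds.

    Sketch of the filtering semantics described in the notebook; higher
    thresholds keep fewer, more confident detections.
    """
    return [c for c in candidates
            if c[0] >= box_threshold and c[1] >= text_threshold]

print(len(filter_detections(candidates)))                     # 3
print(len(filter_detections(candidates, box_threshold=0.5)))  # 2
```

Raising either threshold trades recall for precision, which is why the notebook recommends tuning both values on your own imagery.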
docs/index.md

Lines changed: 2 additions & 0 deletions
@@ -40,7 +40,9 @@ The **segment-geospatial** package draws its inspiration from [segment-anything-
 - [Automatically generating object masks](https://samgeo.gishub.org/examples/automatic_mask_generator)
 - [Segmenting satellite imagery with input prompts](https://samgeo.gishub.org/examples/input_prompts)
 - [Segmenting imagery with text prompts](https://samgeo.gishub.org/examples/text_prompts)
+- [Batch segmentation with text prompts](https://samgeo.gishub.org/examples/text_prompts_batch)
 - [Using segment-geospatial with ArcGIS Pro](https://samgeo.gishub.org/examples/arcgis)
+- [Segmenting swimming pools with text prompts](https://samgeo.gishub.org/examples/swimming_pools)
 
 ## Demos
 

mkdocs.yml

Lines changed: 2 additions & 1 deletion
@@ -53,8 +53,9 @@ nav:
       - examples/automatic_mask_generator.ipynb
       - examples/input_prompts.ipynb
       - examples/text_prompts.ipynb
-      - examples/arcgis.ipynb
+      - examples/text_prompts_batch.ipynb
       - examples/swimming_pools.ipynb
+      - examples/arcgis.ipynb
     - API Reference:
       - common module: common.md
       - samgeo module: samgeo.md

0 commit comments