Commit b0bfd44

Merge pull request #263141 from PatrickFarley/comvis-updates
Comvis updates
2 parents 1df11c7 + 7e1b8c5 commit b0bfd44


42 files changed (+845, -763 lines)

articles/ai-services/computer-vision/Tutorials/storage-lab-tutorial.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -374,9 +374,9 @@ Now you have a way to view the images you uploaded. The next step is to do more
 <a name="Exercise5"></a>
 ## Use Azure AI Vision to generate metadata
 
-### Create a Vision resource
+### Create a Vision resource
 
-You'll need to create a Vision resource for your Azure account; this resource manages your access to Azure's Azure AI Vision service.
+You'll need to create a Computer Vision resource for your Azure account; this resource manages your access to Azure's Azure AI Vision service.
 
 1. Follow the instructions in [Create an Azure AI services resource](../../multi-service-resource.md?pivots=azportal) to create a multi-service resource or a Vision resource.
 
```

articles/ai-services/computer-vision/concept-background-removal.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -53,7 +53,7 @@ It's important to note the limitations of background removal:
 
 ## Use the API
 
-The background removal feature is available through the [Segment](https://centraluseuap.dev.cognitive.microsoft.com/docs/services/unified-vision-apis-public-preview-2023-02-01-preview/operations/63e6b6d9217d201194bbecbd) API (`imageanalysis:segment`). You can call this API through the REST API or the Vision SDK. See the [Background removal how-to guide](./how-to/background-removal.md) for more information.
+The background removal feature is available through the [Segment](https://centraluseuap.dev.cognitive.microsoft.com/docs/services/unified-vision-apis-public-preview-2023-02-01-preview/operations/63e6b6d9217d201194bbecbd) API (`imageanalysis:segment`). See the [Background removal how-to guide](./how-to/background-removal.md) for more information.
 
 ## Next steps
 
```
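The commit makes this section REST-only. For readers following the change, here is a minimal, hypothetical sketch of how the pieces of such a direct REST call could be assembled; the endpoint, key, and image URL below are placeholders, and the URL pattern and `mode` query string follow the background-removal how-to guide touched later in this commit.

```python
# Hypothetical helper: assemble the URL, headers, and body for a direct
# REST call to `imageanalysis:segment`. Nothing is sent here; pass the
# result to any HTTP client.

def build_segment_request(endpoint: str, key: str, image_url: str,
                          mode: str = "backgroundRemoval"):
    """Return (url, headers, body) for a background removal call."""
    url = (f"https://{endpoint}/computervision/imageanalysis:segment"
           f"?api-version=2023-02-01-preview&mode={mode}")
    headers = {
        "Ocp-Apim-Subscription-Key": key,  # never hard-code a real key
        "Content-Type": "application/json",
    }
    body = {"url": image_url}  # image passed by URL; binary upload also works
    return url, headers, body

# Placeholder values for illustration only.
url, headers, body = build_segment_request(
    "myresource.cognitiveservices.azure.com", "<key>",
    "https://example.com/photo.jpg")
```

POST `body` as JSON to `url` with `headers`; the response is the segmented image.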

articles/ai-services/computer-vision/concept-ocr.md

Lines changed: 76 additions & 215 deletions
````diff
@@ -35,236 +35,97 @@ The following JSON response illustrates what the Image Analysis 4.0 API returns
 
 ```json
 {
+  "modelVersion": "2023-10-01",
   "metadata":
   {
     "width": 1000,
     "height": 945
   },
   "readResult":
   {
-    "stringIndexType": "TextElements",
-    "content": "You must be the change you\nWish to see in the world !\nEverything has its beauty , but\nnot everyone sees it !",
-    "pages":
+    "blocks":
     [
       {
-        "height": 945,
-        "width": 1000,
-        "angle": -1.099,
-        "pageNumber": 1,
-        "words":
-        [
-          {
-            "content": "You",
-            "boundingBox": [253,268,301,267,304,318,256,318],
-            "confidence": 0.998,
-            "span": {"offset":0,"length":3}
-          },
-          {
-            "content": "must",
-            "boundingBox": [310,266,376,265,378,316,313,317],
-            "confidence": 0.988,
-            "span": {"offset":4,"length":4}
-          },
-          {
-            "content": "be",
-            "boundingBox": [385,264,426,264,428,314,388,316],
-            "confidence": 0.928,
-            "span": {"offset":9,"length":2}
-          },
-          {
-            "content": "the",
-            "boundingBox": [435,263,494,263,496,311,437,314],
-            "confidence": 0.997,
-            "span": {"offset":12,"length":3}
-          },
-          {
-            "content": "change",
-            "boundingBox": [503,263,600,262,602,306,506,311],
-            "confidence": 0.995,
-            "span": {"offset":16,"length":6}
-          },
-          {
-            "content": "you",
-            "boundingBox": [609,262,665,263,666,302,611,305],
-            "confidence": 0.998,
-            "span": {"offset":23,"length":3}
-          },
-          {
-            "content": "Wish",
-            "boundingBox": [327,348,391,343,392,380,328,382],
-            "confidence": 0.98,
-            "span": {"offset":27,"length":4}
-          },
-          {
-            "content": "to",
-            "boundingBox": [406,342,438,340,439,378,407,379],
-            "confidence": 0.997,
-            "span": {"offset":32,"length":2}
-          },
-          {
-            "content": "see",
-            "boundingBox": [446,340,492,337,494,376,447,378],
-            "confidence": 0.998,
-            "span": {"offset":35,"length":3}
-          },
-          {
-            "content": "in",
-            "boundingBox": [500,337,527,336,529,375,501,376],
-            "confidence": 0.983,
-            "span": {"offset":39,"length":2}
-          },
-          {
-            "content": "the",
-            "boundingBox": [534,336,588,334,590,373,536,375],
-            "confidence": 0.993,
-            "span": {"offset":42,"length":3}
-          },
-          {
-            "content": "world",
-            "boundingBox": [599,334,655,333,658,371,601,373],
-            "confidence": 0.998,
-            "span": {"offset":46,"length":5}
-          },
-          {
-            "content": "!",
-            "boundingBox": [663,333,687,333,690,370,666,371],
-            "confidence": 0.915,
-            "span": {"offset":52,"length":1}
-          },
-          {
-            "content": "Everything",
-            "boundingBox": [255,446,371,441,372,490,256,494],
-            "confidence": 0.97,
-            "span": {"offset":54,"length":10}
-          },
-          {
-            "content": "has",
-            "boundingBox": [380,441,421,440,421,488,381,489],
-            "confidence": 0.793,
-            "span": {"offset":65,"length":3}
-          },
-          {
-            "content": "its",
-            "boundingBox": [430,440,471,439,471,487,431,488],
-            "confidence": 0.998,
-            "span": {"offset":69,"length":3}
-          },
-          {
-            "content": "beauty",
-            "boundingBox": [480,439,552,439,552,485,481,487],
-            "confidence": 0.296,
-            "span": {"offset":73,"length":6}
-          },
-          {
-            "content": ",",
-            "boundingBox": [561,439,571,439,571,485,562,485],
-            "confidence": 0.742,
-            "span": {"offset":80,"length":1}
-          },
-          {
-            "content": "but",
-            "boundingBox": [580,439,636,439,636,485,580,485],
-            "confidence": 0.885,
-            "span": {"offset":82,"length":3}
-          },
-          {
-            "content": "not",
-            "boundingBox": [364,516,412,512,413,546,366,549],
-            "confidence": 0.994,
-            "span": {"offset":86,"length":3}
-          },
-          {
-            "content": "everyone",
-            "boundingBox": [422,511,520,504,521,540,423,545],
-            "confidence": 0.993,
-            "span": {"offset":90,"length":8}
-          },
-          {
-            "content": "sees",
-            "boundingBox": [530,503,586,500,588,538,531,540],
-            "confidence": 0.988,
-            "span": {"offset":99,"length":4}
-          },
-          {
-            "content": "it",
-            "boundingBox": [596,500,627,498,628,536,598,537],
-            "confidence": 0.998,
-            "span": {"offset":104,"length":2}
-          },
-          {
-            "content": "!",
-            "boundingBox": [634,498,657,497,659,536,635,536],
-            "confidence": 0.994,
-            "span": {"offset":107,"length":1}
-          }
-        ],
-        "spans":
-        [
-          {
-            "offset": 0,
-            "length": 108
-          }
-        ],
         "lines":
         [
           {
-            "content": "You must be the change you",
-            "boundingBox": [253,267,670,262,671,307,254,318],
-            "spans": [{"offset":0,"length":26}]
-          },
-          {
-            "content": "Wish to see in the world !",
-            "boundingBox": [326,343,691,332,693,369,327,382],
-            "spans": [{"offset":27,"length":26}]
-          },
-          {
-            "content": "Everything has its beauty , but",
-            "boundingBox": [254,443,640,438,641,485,255,493],
-            "spans": [{"offset":54,"length":31}]
-          },
-          {
-            "content": "not everyone sees it !",
-            "boundingBox": [364,512,658,496,660,534,365,549],
-            "spans": [{"offset":86,"length":22}]
+            "text": "You must be the change you",
+            "boundingPolygon":
+            [
+              {"x":251,"y":265},
+              {"x":673,"y":260},
+              {"x":674,"y":308},
+              {"x":252,"y":318}
+            ],
+            "words":
+            [
+              {"text":"You","boundingPolygon":[{"x":252,"y":267},{"x":307,"y":265},{"x":307,"y":318},{"x":253,"y":318}],"confidence":0.996},
+              {"text":"must","boundingPolygon":[{"x":318,"y":264},{"x":386,"y":263},{"x":387,"y":316},{"x":319,"y":318}],"confidence":0.99},
+              {"text":"be","boundingPolygon":[{"x":396,"y":262},{"x":432,"y":262},{"x":432,"y":315},{"x":396,"y":316}],"confidence":0.891},
+              {"text":"the","boundingPolygon":[{"x":441,"y":262},{"x":503,"y":261},{"x":503,"y":312},{"x":442,"y":314}],"confidence":0.994},
+              {"text":"change","boundingPolygon":[{"x":513,"y":261},{"x":613,"y":262},{"x":613,"y":306},{"x":513,"y":311}],"confidence":0.99},
+              {"text":"you","boundingPolygon":[{"x":623,"y":262},{"x":673,"y":263},{"x":673,"y":302},{"x":622,"y":305}],"confidence":0.994}
+            ]
+          },
+          {
+            "text": "wish to see in the world !",
+            "boundingPolygon":
+            [
+              {"x":325,"y":338},
+              {"x":695,"y":328},
+              {"x":696,"y":370},
+              {"x":325,"y":381}
+            ],
+            "words":
+            [
+              {"text":"wish","boundingPolygon":[{"x":325,"y":339},{"x":390,"y":337},{"x":391,"y":380},{"x":326,"y":381}],"confidence":0.992},
+              {"text":"to","boundingPolygon":[{"x":406,"y":337},{"x":443,"y":335},{"x":443,"y":379},{"x":407,"y":380}],"confidence":0.995},
+              {"text":"see","boundingPolygon":[{"x":451,"y":335},{"x":494,"y":334},{"x":494,"y":377},{"x":452,"y":379}],"confidence":0.996},
+              {"text":"in","boundingPolygon":[{"x":502,"y":333},{"x":533,"y":332},{"x":534,"y":376},{"x":503,"y":377}],"confidence":0.996},
+              {"text":"the","boundingPolygon":[{"x":542,"y":332},{"x":590,"y":331},{"x":590,"y":375},{"x":542,"y":376}],"confidence":0.995},
+              {"text":"world","boundingPolygon":[{"x":599,"y":331},{"x":664,"y":329},{"x":664,"y":372},{"x":599,"y":374}],"confidence":0.995},
+              {"text":"!","boundingPolygon":[{"x":672,"y":329},{"x":694,"y":328},{"x":694,"y":371},{"x":672,"y":372}],"confidence":0.957}
+            ]
+          },
+          {
+            "text": "Everything has its beauty , but",
+            "boundingPolygon":
+            [
+              {"x":254,"y":439},
+              {"x":644,"y":433},
+              {"x":645,"y":484},
+              {"x":255,"y":488}
+            ],
+            "words":
+            [
+              {"text":"Everything","boundingPolygon":[{"x":254,"y":442},{"x":379,"y":440},{"x":380,"y":486},{"x":257,"y":488}],"confidence":0.97},
+              {"text":"has","boundingPolygon":[{"x":388,"y":440},{"x":435,"y":438},{"x":436,"y":485},{"x":389,"y":486}],"confidence":0.965},
+              {"text":"its","boundingPolygon":[{"x":445,"y":438},{"x":485,"y":437},{"x":486,"y":485},{"x":446,"y":485}],"confidence":0.99},
+              {"text":"beauty","boundingPolygon":[{"x":495,"y":437},{"x":567,"y":435},{"x":568,"y":485},{"x":496,"y":485}],"confidence":0.685},
+              {"text":",","boundingPolygon":[{"x":577,"y":435},{"x":583,"y":435},{"x":583,"y":485},{"x":577,"y":485}],"confidence":0.939},
+              {"text":"but","boundingPolygon":[{"x":589,"y":435},{"x":644,"y":434},{"x":644,"y":485},{"x":589,"y":485}],"confidence":0.628}
+            ]
+          },
+          {
+            "text": "not everyone sees it !",
+            "boundingPolygon":
+            [
+              {"x":363,"y":508},
+              {"x":658,"y":493},
+              {"x":659,"y":539},
+              {"x":364,"y":552}
+            ],
+            "words":
+            [
+              {"text":"not","boundingPolygon":[{"x":363,"y":510},{"x":412,"y":508},{"x":413,"y":548},{"x":365,"y":552}],"confidence":0.989},
+              {"text":"everyone","boundingPolygon":[{"x":420,"y":507},{"x":521,"y":501},{"x":522,"y":542},{"x":421,"y":548}],"confidence":0.924},
+              {"text":"sees","boundingPolygon":[{"x":536,"y":501},{"x":588,"y":498},{"x":589,"y":540},{"x":537,"y":542}],"confidence":0.987},
+              {"text":"it","boundingPolygon":[{"x":597,"y":497},{"x":627,"y":495},{"x":628,"y":540},{"x":598,"y":540}],"confidence":0.995},
+              {"text":"!","boundingPolygon":[{"x":635,"y":495},{"x":656,"y":494},{"x":657,"y":540},{"x":636,"y":540}],"confidence":0.952}
+            ]
           }
         ]
       }
-    ],
-    "styles":
-    [
-      {
-        "isHandwritten": true,
-        "spans":
-        [
-          {
-            "offset": 0,
-            "length": 26
-          }
-        ],
-        "confidence": 0.95
-      },
-      {
-        "isHandwritten": true,
-        "spans":
-        [
-          {
-            "offset": 27,
-            "length": 58
-          }
-        ],
-        "confidence": 1
-      },
-      {
-        "isHandwritten": true,
-        "spans":
-        [
-          {
-            "offset": 86,
-            "length": 22
-          }
-        ],
-        "confidence": 0.9
-      }
     ]
   }
 }
````

articles/ai-services/computer-vision/how-to/background-removal.md

Lines changed: 7 additions & 8 deletions
```diff
@@ -16,13 +16,15 @@ ms.custom: references_regions
 
 This article demonstrates how to call the Image Analysis 4.0 API to segment an image. It also shows you how to parse the returned information.
 
+> [!IMPORTANT]
+> Background removal is only available through direct REST API calls. It is not available through the SDKs.
+
 ## Prerequisites
 
 This guide assumes you have successfully followed the steps mentioned in the [quickstart](../quickstarts-sdk/image-analysis-client-library-40.md) page. This means:
 
 * You have <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="created a Vision resource" target="_blank">created a Vision resource </a> and obtained a key and endpoint URL.
-* If you're using the client SDK, you have the appropriate SDK package installed and you have a running quickstart application. You modify this quickstart application based on code examples here.
-* If you're using 4.0 REST API calls directly, you have successfully made a `curl.exe` call to the service (or used an alternative tool). You modify the `curl.exe` call based on the examples here.
+* You have successfully made a `curl.exe` call to the service (or used an alternative tool). You modify the `curl.exe` call based on the examples here.
 
 The quickstart shows you how to extract visual features from an image, however, the concepts are similar to background removal. Therefore you benefit from starting from the quickstart and making modifications.
 
```
```diff
@@ -31,13 +33,11 @@ The quickstart shows you how to extract visual features from an image, however,
 
 ## Authenticate against the service
 
-To authenticate against the Image Analysis service, you need an Azure AI Vision key and endpoint URL.
+To authenticate against the Image Analysis service, you need a Computer Vision key and endpoint URL.
 
 > [!TIP]
 > Don't include the key directly in your code, and never post it publicly. See the Azure AI services [security](../../security-features.md) article for more authentication options like [Azure Key Vault](../../use-key-vault.md).
 
-The SDK example assumes that you defined the environment variables `VISION_KEY` and `VISION_ENDPOINT` with your key and endpoint.
-
 <!--
 #### [C#](#tab/csharp)
 
```
```diff
@@ -70,7 +70,7 @@ Where we used this helper function to read the value of an environment variable:
 #### [REST API](#tab/rest)
 -->
 
-Authentication is done by adding the HTTP request header **Ocp-Apim-Subscription-Key** and setting it to your vision key. The call is made to the URL `https://<endpoint>/computervision/imageanalysis:segment?api-version=2023-02-01-preview`, where `<endpoint>` is your unique Azure AI Vision endpoint URL. See [Select a mode ](./background-removal.md#select-a-mode) section for another query string you add to this URL.
+Authentication is done by adding the HTTP request header **Ocp-Apim-Subscription-Key** and setting it to your vision key. The call is made to the URL `https://<endpoint>/computervision/imageanalysis:segment?api-version=2023-02-01-preview`, where `<endpoint>` is your unique Computer Vision endpoint URL. See [Select a mode ](./background-removal.md#select-a-mode) section for another query string you add to this URL.
 
 
 ## Select the image to analyze
```
```diff
@@ -220,10 +220,9 @@ The following one-channel PNG image is the response for the `foregroundMatting`
 
 The API returns an image the same size as the original for the `foregroundMatting` mode, but at most 16 megapixels (preserving image aspect ratio) for the `backgroundRemoval` mode.
 
-
 ## Error codes
 
-[!INCLUDE [Image Analysis Error Codes](../includes/image-analysis-error-codes-40.md)]
+[!INCLUDE [image-analysis-error-codes-40](../includes/image-analysis-error-codes-40.md)]
 
 ---
 
```
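As a companion to the authentication hunk above, here is a hedged sketch of assembling the authenticated request URL and header in Python. `VISION_KEY` and `VISION_ENDPOINT` are the environment variable names the quickstarts use (the sentence removed in this commit referenced them); the URL pattern is the one quoted in the diff.

```python
import os

# Sketch only: read credentials from the environment instead of hard-coding
# them. VISION_KEY / VISION_ENDPOINT are the variable names used by the
# quickstarts; substitute your own configuration mechanism as needed.
key = os.environ.get("VISION_KEY", "<your-key>")
endpoint = os.environ.get("VISION_ENDPOINT", "https://<endpoint>").rstrip("/")

url = (f"{endpoint}/computervision/imageanalysis:segment"
       "?api-version=2023-02-01-preview")
headers = {"Ocp-Apim-Subscription-Key": key}  # never commit a real key
```

Append the mode query string described in the "Select a mode" section before sending the request with your HTTP client of choice.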
