Commit e2a886d

Fix typo: programatically -> programmatically
1 parent a7ab2bd

10 files changed: +11 -11 lines changed


articles/ai-services/agents/how-to/tools/bing-grounding.md

Lines changed: 1 addition & 1 deletion
@@ -92,4 +92,4 @@ print(f"Last run step detail: {run_steps_data}")
 
 ## Next steps
 
-See [code samples](./bing-code-samples.md) for using the Grounding with Bing tool programatically.
+See [code samples](./bing-code-samples.md) for using the Grounding with Bing tool programmatically.

articles/ai-services/agents/how-to/tools/code-interpreter-samples.md

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@ Azure AI Agents supports using the Code Interpreter tool, which allows an agent
 
 ## Using the code interpreter tool with an agent
 
-You can add the code interpreter tool to an agent programatically using the code examples listed at the top of this article, or the [Azure AI Foundry portal](https://ai.azure.com/). If you want to use the portal:
+You can add the code interpreter tool to an agent programmatically using the code examples listed at the top of this article, or the [Azure AI Foundry portal](https://ai.azure.com/). If you want to use the portal:
 
 1. In the **Agents** screen for your agent, scroll down the **Setup** pane on the right to **action**. Then select **Add**.

articles/ai-services/agents/how-to/tools/fabric.md

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ You need to first build and publish a Fabric data agent and then connect your Fa
 
 :::zone pivot="portal"
 
-You can add the Microsoft Fabric tool to an agent programatically using the code examples listed at the top of this article, or the Azure AI Foundry portal. If you want to use the portal:
+You can add the Microsoft Fabric tool to an agent programmatically using the code examples listed at the top of this article, or the Azure AI Foundry portal. If you want to use the portal:
 
 1. Navigate to the **Agents** screen for your agent in [Azure AI Foundry](https://ai.azure.com/), scroll down the Setup pane on the right to **knowledge**. Then select **Add**.

articles/ai-services/openai/includes/batch/batch-python.md

Lines changed: 1 addition & 1 deletion
@@ -226,7 +226,7 @@ If your batch jobs are so large that you're hitting the enqueued token limit eve
 
 ## Track batch job progress
 
-Once you have created batch job successfully you can monitor its progress either in the Studio or programatically. When checking batch job progress we recommend waiting at least 60 seconds in between each status call.
+Once you have created batch job successfully you can monitor its progress either in the Studio or programmatically. When checking batch job progress we recommend waiting at least 60 seconds in between each status call.
 
 ```Python
 import time
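
The diff context above stops at the start of the article's own example. Purely as an illustration of the polling pattern that paragraph describes, a minimal sketch with the `openai` Python package (v1+) might look like the following; the endpoint, key, and `batch_id` are placeholders, not values from the article.

```python
# Minimal sketch, not the article's sample: poll an Azure OpenAI batch job,
# waiting at least 60 seconds between status calls as recommended above.
import time

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com/",
    api_key="YOUR_API_KEY",
    api_version="2025-03-01-preview",
)

batch_id = "YOUR_BATCH_ID"  # returned when the batch job was created
terminal = {"completed", "failed", "cancelled", "expired"}

while True:
    batch = client.batches.retrieve(batch_id)
    print(f"{time.strftime('%H:%M:%S')} batch status: {batch.status}")
    if batch.status in terminal:
        break
    time.sleep(60)
```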

articles/ai-services/openai/includes/batch/batch-rest.md

Lines changed: 1 addition & 1 deletion
@@ -189,7 +189,7 @@ The default 500 max file limit per resource also applies to output files. Here y
 
 ## Track batch job progress
 
-Once you have created batch job successfully you can monitor its progress either in the Studio or programatically. When checking batch job progress we recommend waiting at least 60 seconds in between each status call.
+Once you have created batch job successfully you can monitor its progress either in the Studio or programmatically. When checking batch job progress we recommend waiting at least 60 seconds in between each status call.
 
 ```http
 curl https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches/{batch_id}?api-version=2025-03-01-preview \

articles/machine-learning/how-to-use-mlflow-cli-runs.md

Lines changed: 2 additions & 2 deletions
@@ -204,7 +204,7 @@ mlflow.autolog()
 
 ## View metrics and artifacts in your workspace
 
-The metrics and artifacts from MLflow logging are tracked in your workspace. You can view and access them in Azure Machine Learning studio or access them programatically via the MLflow SDK.
+The metrics and artifacts from MLflow logging are tracked in your workspace. You can view and access them in Azure Machine Learning studio or access them programmatically via the MLflow SDK.
 
 To view metrics and artifacts in the studio:
 
@@ -215,7 +215,7 @@ To view metrics and artifacts in the studio:
 
 :::image type="content" source="media/how-to-log-view-metrics/metrics.png" alt-text="Screenshot of the metrics view that shows the list of metrics and the charts created from the metrics." lightbox="media/how-to-log-view-metrics/metrics.png":::
 
-To access or query metrics, parameters, and artifacts programatically via the MLflow SDK, use [mlflow.get_run()](https://mlflow.org/docs/latest/python_api/mlflow.html#mlflow.get_run).
+To access or query metrics, parameters, and artifacts programmatically via the MLflow SDK, use [mlflow.get_run()](https://mlflow.org/docs/latest/python_api/mlflow.html#mlflow.get_run).
 
 ```python
 import mlflow
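
As a hedged illustration of the `mlflow.get_run()` call the changed sentence points to (assuming the tracking URI is already set to the workspace, with `run_id` as a placeholder):

```python
# Minimal sketch: read back metrics, params, and artifacts for a run.
import mlflow

run_id = "YOUR_RUN_ID"
run = mlflow.get_run(run_id)

print(run.data.metrics)  # dict of logged metrics
print(run.data.params)   # dict of logged parameters

# Artifacts are listed through the MLflow tracking client.
client = mlflow.tracking.MlflowClient()
for artifact in client.list_artifacts(run_id):
    print(artifact.path)
```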

articles/machine-learning/v1/concept-azure-machine-learning-architecture.md

Lines changed: 1 addition & 1 deletion
@@ -234,7 +234,7 @@ When you deploy a trained model in the designer, you can [deploy the model as a
 
 #### Pipeline endpoints
 
-Pipeline endpoints let you call your [ML Pipelines](#ml-pipelines) programatically via a REST endpoint. Pipeline endpoints let you automate your pipeline workflows.
+Pipeline endpoints let you call your [ML Pipelines](#ml-pipelines) programmatically via a REST endpoint. Pipeline endpoints let you automate your pipeline workflows.
 
 A pipeline endpoint is a collection of published pipelines. This logical organization lets you manage and call multiple pipelines using the same endpoint. Each published pipeline in a pipeline endpoint is versioned. You can select a default pipeline for the endpoint, or specify a version in the REST call.
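
To make "call programmatically via a REST endpoint" concrete, a minimal sketch (not from the article) of submitting a published pipeline with `requests`; the endpoint URL, token, and experiment name are placeholders, and the exact response shape may vary by API version.

```python
# Minimal sketch: trigger a pipeline endpoint over REST with an AAD token.
import requests

rest_endpoint = "<your-pipeline-endpoint-url>"
aad_token = "<your-access-token>"

response = requests.post(
    rest_endpoint,
    headers={"Authorization": f"Bearer {aad_token}"},
    json={"ExperimentName": "my-pipeline-experiment"},
)
response.raise_for_status()
print(response.json())  # includes the ID of the submitted pipeline run
```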

articles/machine-learning/v1/how-to-designer-import-data.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ We recommend that you use [datasets](concept-data.md) to import data into the de
 
 ### Register a dataset
 
-You can register existing datasets [programatically with the SDK](how-to-create-register-datasets.md#create-datasets-from-datastores) or [visually in Azure Machine Learning studio](how-to-connect-data-ui.md#create-data-assets).
+You can register existing datasets [programmatically with the SDK](how-to-create-register-datasets.md#create-datasets-from-datastores) or [visually in Azure Machine Learning studio](how-to-connect-data-ui.md#create-data-assets).
 
 You can also register the output for any designer component as a dataset.
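
As an illustration of the "programmatically with the SDK" path (v1 `azureml-core` SDK; the datastore name and file path below are placeholders, not taken from the linked article):

```python
# Minimal sketch: create a tabular dataset from a datastore and register it
# so it appears in the designer's dataset list.
from azureml.core import Dataset, Workspace

ws = Workspace.from_config()  # reads the workspace config.json
datastore = ws.datastores["workspaceblobstore"]

dataset = Dataset.Tabular.from_delimited_files(path=(datastore, "data/my-data.csv"))
dataset = dataset.register(workspace=ws, name="my-dataset", create_new_version=True)
```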

articles/search/includes/quickstarts/full-text-javascript.md

Lines changed: 1 addition & 1 deletion
@@ -593,7 +593,7 @@ console.log(`Index named ${index.name} has been created.`);
 
 ### Load documents
 
-In Azure AI Search, documents are data structures that are both inputs to indexing and outputs from queries. You can push such data to the index or use an [indexer](/azure/search/search-indexer-overview). In this case, we'll programatically push the documents to the index.
+In Azure AI Search, documents are data structures that are both inputs to indexing and outputs from queries. You can push such data to the index or use an [indexer](/azure/search/search-indexer-overview). In this case, we'll programmatically push the documents to the index.
 
 Document inputs might be rows in a database, blobs in Blob storage, or, as in this sample, JSON documents on disk. Similar to what we did with the `indexDefinition`, we also need to import `hotels.json` at the top of *index.js* so that the data can be accessed in our main function.

articles/search/includes/quickstarts/full-text-typescript.md

Lines changed: 1 addition & 1 deletion
@@ -529,7 +529,7 @@ console.log(`Index named ${index.name} has been created.`);
 
 ### Load documents
 
-In Azure AI Search, documents are data structures that are both inputs to indexing and outputs from queries. You can push such data to the index or use an [indexer](/azure/search/search-indexer-overview). In this case, we'll programatically push the documents to the index.
+In Azure AI Search, documents are data structures that are both inputs to indexing and outputs from queries. You can push such data to the index or use an [indexer](/azure/search/search-indexer-overview). In this case, we'll programmatically push the documents to the index.
 
 Document inputs might be rows in a database, blobs in Blob storage, or, as in this sample, JSON documents on disk. You can either download [hotels.json](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/main/quickstart/hotels.json) or create your own *hotels.json* file with the following content:
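
The quickstart itself continues in TypeScript; purely to illustrate the same push model in another language, a minimal Python sketch with the `azure-search-documents` package (endpoint, key, index name, and documents are placeholders):

```python
# Minimal sketch: push documents into an existing search index.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<your-service>.search.windows.net",
    index_name="<your-index-name>",
    credential=AzureKeyCredential("<your-admin-key>"),
)

documents = [
    {"HotelId": "1", "HotelName": "Example Hotel One"},
    {"HotelId": "2", "HotelName": "Example Hotel Two"},
]

result = search_client.upload_documents(documents=documents)
print(f"Uploaded {len(result)} documents")
```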

0 commit comments