Commit 24b9665

Commit message: fix
1 parent: b1dddec

File tree: 2 files changed (+4, −1 lines)


articles/machine-learning/concept-endpoints-online.md (3 additions, 1 deletion)

```diff
@@ -9,7 +9,7 @@ ms.topic: concept-article
 author: msakande
 ms.author: mopeakande
 ms.reviewer: sehan
-ms.custom: devplatv2
+ms.custom: devplatv2, FY25Q1-Linter
 ms.date: 09/23/2024
 
 #Customer intent: As an ML pro, I want to understand what an online endpoint is and why I need it.
@@ -21,6 +21,8 @@ ms.date: 09/23/2024
 
 This article describes online endpoints for real-time inferencing in Azure Machine Learning. Inferencing is the process of applying new input data to a machine learning model to generate outputs. Azure Machine Learning allows you to perform real-time inferencing on data by using models that are deployed to *online endpoints*. While these outputs are typically called *predictions*, you can use inferencing to generate outputs for other machine learning tasks, such as classification and clustering.
 
+## Online endpoints
+
 Online endpoints deploy models to a web server that can return predictions under the HTTP protocol. Online endpoints can operationalize models for real-time inference in synchronous, low-latency requests, and are best used when:
 
 - You have low-latency requirements.
```
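The concept article's claim that online endpoints "return predictions under the HTTP protocol" can be illustrated with a minimal sketch of assembling a scoring call. Everything here is hypothetical: the scoring URI, key, and payload shape all depend on your endpoint and your model's scoring script. With key-based auth, Azure ML online endpoints expect the endpoint key as a Bearer token in the `Authorization` header.

```python
import json


def build_scoring_request(scoring_uri: str, api_key: str, input_data: dict) -> dict:
    """Assemble the pieces of a REST call to an online endpoint.

    `scoring_uri` and `api_key` come from the endpoint's details page;
    the payload schema depends entirely on the model's scoring script.
    """
    return {
        "url": scoring_uri,
        "headers": {
            "Content-Type": "application/json",
            # Key-based auth: the endpoint key is sent as a Bearer token.
            "Authorization": f"Bearer {api_key}",
        },
        "body": json.dumps(input_data),
    }


# Hypothetical endpoint values, for illustration only.
request = build_scoring_request(
    scoring_uri="https://my-endpoint.eastus.inference.ml.azure.com/score",
    api_key="<endpoint-key>",
    input_data={"data": [[0.1, 2.3, 4.5]]},
)
```

You would then POST `request["body"]` to `request["url"]` with `request["headers"]` using any HTTP client; the synchronous response carries the model's predictions.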

articles/machine-learning/how-to-troubleshoot-online-endpoints.md (1 addition, 0 deletions)

```diff
@@ -159,6 +159,7 @@ ml_client.online_deployments.get_logs(
 
 ### [Studio](#tab/studio)
 
+<a name="see-log-output-in-azure-machine-learning-studio"></a>
 To view log output from a container in Azure Machine Learning studio:
 
 1. Select **Endpoints** in the left navigation bar.
```
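The hunk context above shows the SDK call `ml_client.online_deployments.get_logs(`; a thin wrapper around it might look like the sketch below. The endpoint and deployment names are placeholders, and `ml_client` is assumed to be an `azure.ai.ml.MLClient` (only the `online_deployments.get_logs` surface is used, so anything with the same shape works).

```python
def fetch_deployment_logs(
    ml_client, endpoint_name: str, deployment_name: str, lines: int = 50
) -> str:
    """Return the last `lines` lines of a deployment's container logs.

    `ml_client` is assumed to expose `online_deployments.get_logs`,
    as azure.ai.ml.MLClient does; names here are placeholders.
    """
    return ml_client.online_deployments.get_logs(
        name=deployment_name,
        endpoint_name=endpoint_name,
        lines=lines,
    )


# Hypothetical usage, assuming `ml_client` was built with MLClient(...):
# logs = fetch_deployment_logs(ml_client, "my-endpoint", "blue", lines=100)
```

Keeping the client duck-typed makes the wrapper easy to exercise with a stub in tests, without an Azure connection.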

0 commit comments