
Commit 21c3421

revert portal tutorial
1 parent 24d8c2d commit 21c3421

File tree

1 file changed: +33 -14 lines changed

articles/azure-monitor/logs/tutorial-logs-ingestion-portal.md

Lines changed: 33 additions & 14 deletions
@@ -2,7 +2,7 @@
 title: 'Tutorial: Send data to Azure Monitor Logs with Logs ingestion API (Azure portal)'
 description: Tutorial on how sending data to a Log Analytics workspace in Azure Monitor using the Logs ingestion API. Supporting components configured using the Azure portal.
 ms.topic: tutorial
-ms.date: 03/12/2024
+ms.date: 09/14/2023
 author: bwren
 ms.author: bwren
 
@@ -20,12 +20,11 @@ The [Logs Ingestion API](logs-ingestion-api-overview.md) in Azure Monitor allows
 The steps required to configure the Logs ingestion API are as follows:
 
 1. [Create a Microsoft Entra application](#create-azure-ad-application) to authenticate against the API.
+3. [Create a data collection endpoint (DCE)](#create-data-collection-endpoint) to receive data.
 2. [Create a custom table in a Log Analytics workspace](#create-new-table-in-log-analytics-workspace). This is the table you'll be sending data to. As part of this process, you will create a data collection rule (DCR) to direct the data to the target table.
 5. [Give the AD application access to the DCR](#assign-permissions-to-the-dcr).
 6. [Use sample code to send data to using the Logs ingestion API](#send-sample-data).
 
-> [!NOTE]
-> This article previously included a step to create a data collection endpoint (DCE). This is no longer required since [DCRs now include their own endpoint](../essentials/data-collection-endpoint-overview.md). A DCE is only required with Logs ingestion API if private link is used.
 
 ## Prerequisites
 To complete this tutorial, you need:
@@ -64,6 +63,26 @@ Start by registering a Microsoft Entra application to authenticate against the A
 
    :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" alt-text="Screenshot that shows the secret value for the new app.":::
 
+## Create data collection endpoint
+
+> [!NOTE]
+> A DCE is no longer required in most cases since you can use the endpoint of the DCR. The Azure portal though has not yet been updated to reflect this change and currently requires a DCE when you create the custom log. If you use [other methods to create the custom table and DCR](./tutorial-logs-ingestion-api.md), you can use the DCR endpoint instead of a DCE.
+
+
+A [data collection endpoint](../essentials/data-collection-endpoint-overview.md) is required to accept the data from the script. After you configure the DCE and link it to a DCR, you can send data over HTTP from your application. The DCE needs to be in the same region as the Log Analytics workspace where the data will be sent or the data collection rule being used.
+
+1. To create a new DCE, go to the **Monitor** menu in the Azure portal. Select **Data Collection Endpoints** and then select **Create**.
+
+   :::image type="content" source="media/tutorial-logs-ingestion-portal/new-data-collection-endpoint.png" lightbox="media/tutorial-logs-ingestion-portal/new-data-collection-endpoint.png" alt-text="Screenshot that shows new DCE.":::
+
+1. Provide a name for the DCE and ensure that it's in the same region as your workspace. Select **Create** to create the DCE.
+
+   :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-endpoint-details.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-endpoint-details.png" alt-text="Screenshot that shows DCE details.":::
+
+1. After the DCE is created, select it so that you can view its properties. Note the **Logs ingestion** URI because you'll need it in a later step.
+
+   :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-endpoint-uri.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-endpoint-uri.png" alt-text="Screenshot that shows DCE URI.":::
+
 
 ## Create new table in Log Analytics workspace
 Before you can send data to the workspace, you need to create the custom table where the data will be sent.
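As an aside (not part of this commit), the **Logs ingestion** URI noted in the last step above is the value that gets passed to the sample script later. A quick sanity check that a pasted value looks like a DCE ingestion endpoint can be sketched in Python; the `.ingest.monitor.azure.com` hostname pattern is an assumption based on typical DCE URIs, and the helper name is hypothetical:

```python
# Hypothetical helper (assumption, not from this commit): sanity-check that a
# pasted value looks like a DCE "Logs ingestion" URI before using it.
from urllib.parse import urlparse

def looks_like_dce_uri(uri: str) -> bool:
    """True if the URI is https and the host is an *.ingest.monitor.azure.com name."""
    parsed = urlparse(uri)
    return (parsed.scheme == "https"
            and parsed.hostname is not None
            and parsed.hostname.endswith(".ingest.monitor.azure.com"))

print(looks_like_dce_uri("https://my-dce-1234.eastus-1.ingest.monitor.azure.com"))  # True
print(looks_like_dce_uri("https://example.com"))                                    # False
```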
@@ -162,18 +181,18 @@ Instead of directly configuring the schema of the table, you can upload a file w
    :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-create.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-create.png" alt-text="Screenshot that shows custom log create.":::
 
 ## Collect information from the DCR
-With the DCR created, you need to collect its ID and endpoint, which are needed in the API call.
+With the DCR created, you need to collect its ID, which is needed in the API call.
 
 1. On the **Monitor** menu in the Azure portal, select **Data collection rules** and select the DCR you created. From **Overview** for the DCR, select **JSON View**.
 
    :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-rule-json-view.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-rule-json-view.png" alt-text="Screenshot that shows the DCR JSON view.":::
 
-1. Copy the **immutableId** and **logsIngestion** values.
+1. Copy the **immutableId** value.
 
    :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-rule-immutable-id.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-rule-immutable-id.png" alt-text="Screenshot that shows collecting the immutable ID from the JSON view.":::
 
 ## Assign permissions to the DCR
-The final step is to give the application permission to use the DCR. Any application that uses the correct application ID and application key can now send data to Azure Monitor using the DCR.
+The final step is to give the application permission to use the DCR. Any application that uses the correct application ID and application key can now send data to the new DCE and DCR.
 
 1. Select **Access Control (IAM)** for the DCR and then select **Add role assignment**.
 
@@ -207,7 +226,7 @@ The following PowerShell script generates sample data to configure the custom ta
 1. Update the values of `$tenantId`, `$appId`, and `$appSecret` with the values you noted for **Directory (tenant) ID**, **Application (client) ID**, and secret **Value**. Then save it with the file name *LogGenerator.ps1*.
 
    ``` PowerShell
-   param ([Parameter(Mandatory=$true)] $Log, $Type="file", $Output, $DcrImmutableId, $endpointURI, $Table)
+   param ([Parameter(Mandatory=$true)] $Log, $Type="file", $Output, $DcrImmutableId, $DceURI, $Table)
    ################
    ##### Usage
    ################
@@ -217,13 +236,13 @@ The following PowerShell script generates sample data to configure the custom ta
    # API call. Data will be written to a file by default.
    # [-Output <String>] - Path to resulting JSON sample
    # [-DcrImmutableId <string>] - DCR immutable ID
-   # [-endpointURI] - Logs ingestion URI
+   # [-DceURI] - Data collection endpoint URI
    # [-Table] - The name of the custom log table, including "_CL" suffix
 
 
    ##### >>>> PUT YOUR VALUES HERE <<<<<
    # Information needed to authenticate to Azure Active Directory and obtain a bearer token
-   $tenantId = "<put tenant ID here>"; #the tenant ID in which the DCR resides
+   $tenantId = "<put tenant ID here>"; #the tenant ID in which the Data Collection Endpoint resides
    $appId = "<put application ID here>"; #the app ID created and granted permissions
    $appSecret = "<put secret value here>"; #the secret created for the above app - never store your secrets in the source code
    ##### >>>> END <<<<<
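The `$tenantId`, `$appId`, and `$appSecret` values in the hunk above feed a client-credentials token request later in the script. As a hedged illustration (not part of this commit), the same request can be sketched in Python; the token endpoint and the `https://monitor.azure.com//.default` scope follow the tutorial's script, and all other values are placeholders:

```python
# Sketch (assumption, not from this commit): the token request the tutorial's
# PowerShell script performs, expressed in Python. All values are placeholders.
from urllib.parse import urlencode

tenant_id = "<put tenant ID here>"      # tenant in which the DCR/DCE resides
app_id = "<put application ID here>"    # the registered application
app_secret = "<put secret value here>"  # never store secrets in source code

# Client-credentials grant against the Microsoft identity platform.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
token_body = urlencode({
    "client_id": app_id,
    "client_secret": app_secret,
    "scope": "https://monitor.azure.com//.default",
    "grant_type": "client_credentials",
})
# POST token_url with token_body (form-encoded), read access_token from the JSON
# response, and pass it in the "Authorization: Bearer ..." header when ingesting.
```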
@@ -262,8 +281,8 @@ The following PowerShell script generates sample data to configure the custom ta
        $DcrImmutableId = Read-Host "Enter DCR Immutable ID"
    };
 
-   if ($null -eq $endpointURI) {
-       $endpointURI = Read-Host "Enter endpoint URI"
+   if ($null -eq $DceURI) {
+       $DceURI = Read-Host "Enter data collection endpoint URI"
    }
 
    if ($null -eq $Table) {
@@ -289,7 +308,7 @@ The following PowerShell script generates sample data to configure the custom ta
    # Sending the data to Log Analytics via the DCR!
    $body = $log_entry | ConvertTo-Json -AsArray;
    $headers = @{"Authorization" = "Bearer $bearerToken"; "Content-Type" = "application/json" };
-   $uri = "$endpointURI/dataCollectionRules/$DcrImmutableId/streams/Custom-$Table"+"?api-version=2023-01-01";
+   $uri = "$DceURI/dataCollectionRules/$DcrImmutableId/streams/Custom-$Table"+"?api-version=2023-01-01";
    $uploadResponse = Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers;
 
    # Let's see how the response looks
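Only the variable name changes in the hunk above; the upload URI keeps the same `<endpoint>/dataCollectionRules/<immutable-id>/streams/Custom-<table>?api-version=2023-01-01` shape. A Python sketch of the equivalent call (not part of this commit; the endpoint, DCR ID, and log record below are made-up placeholders):

```python
# Sketch (assumption, not from this commit): building the same ingestion URI
# and request body in Python. Endpoint, DCR ID, and record are placeholders.
import json

dce_uri = "https://my-dce-1234.eastus-1.ingest.monitor.azure.com"  # hypothetical
dcr_immutable_id = "dcr-00000000000000000000000000000000"          # hypothetical
table = "ApacheAccess_CL"  # custom table from the tutorial, with the _CL suffix

# Same shape the script builds: <endpoint>/dataCollectionRules/<id>/streams/Custom-<table>
uri = (f"{dce_uri}/dataCollectionRules/{dcr_immutable_id}"
       f"/streams/Custom-{table}?api-version=2023-01-01")

log_entries = [{"RawData": "example log line"}]  # body must be a JSON array
body = json.dumps(log_entries)
headers = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}
# POST uri with body and headers (e.g. urllib.request.Request) to ingest records.
```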
@@ -314,10 +333,10 @@ The following PowerShell script generates sample data to configure the custom ta
 ## Send sample data
 Allow at least 30 minutes for the configuration to take effect. You might also experience increased latency for the first few entries, but this activity should normalize.
 
-1. Run the following command providing the values that you collected for your DCR. The script will start ingesting data by placing calls to the API at the pace of approximately one record per second.
+1. Run the following command providing the values that you collected for your DCR and DCE. The script will start ingesting data by placing calls to the API at the pace of approximately one record per second.
 
    ```PowerShell
-   .\LogGenerator.ps1 -Log "sample_access.log" -Type "API" -Table "ApacheAccess_CL" -DcrImmutableId <immutable-id> -endpointURI <logs-ingestion-uri>
+   .\LogGenerator.ps1 -Log "sample_access.log" -Type "API" -Table "ApacheAccess_CL" -DcrImmutableId <immutable ID> -DceUri <data collection endpoint URL>
    ```
 
 1. From Log Analytics, query your newly created table to verify that data arrived and that it's transformed properly.
