articles/azure-monitor/logs/tutorial-logs-ingestion-api.md (177 additions & 1 deletion)
@@ -23,7 +23,7 @@ The steps required to configure the Logs ingestion API are as follows:

6. See [Sample code to send data to Azure Monitor using Logs ingestion API](tutorial-logs-ingestion-code.md) for sample code to send data using the Logs ingestion API.
> [!NOTE]
-> This article previously included a step to create a data collection endpoint (DCE). This is no longer required since [DCRs now include their own endpoint](../essentials/data-collection-endpoint-overview.md). A DCE is only required with Logs ingestion API if private link is used.
+> This article includes options for using a DCR ingestion endpoint or a data collection endpoint (DCE). You can choose to use either one, but a DCE is required with the Logs ingestion API if private link is used. See [When is a DCE required?](../essentials/data-collection-endpoint-overview.md#when-is-a-dce-required).

## Prerequisites

To complete this tutorial, you need:
@@ -64,6 +64,80 @@ Start by registering a Microsoft Entra application to authenticate against the API.

:::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" alt-text="Screenshot that shows the secret value for the new app.":::
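The values you record from the app registration (the directory tenant ID, the application client ID, and the secret value shown above) are what your client later uses to request a token for the Logs ingestion API. As a minimal sketch of that flow, assuming Python with the `azure-identity` package and placeholder IDs rather than anything from this tutorial:

```python
# Minimal sketch, not part of the tutorial's steps: exchange the app
# registration values for a bearer token that the Logs ingestion API accepts.
# All three values are placeholders; the scope shown is the one commonly used
# for Azure Monitor ingestion and may differ in sovereign clouds.
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<directory-tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret-value>",
)

token = credential.get_token("https://monitor.azure.com/.default")
print(f"Token acquired; expires at (epoch seconds): {token.expires_on}")
```

The sample code article linked earlier shows equivalent authentication flows in several languages.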
## Create data collection endpoint

## [DCR endpoint](#tab/dcr)

A DCE isn't required if you use the DCR ingestion endpoint.
## [DCE](#tab/dce)

A [DCE](../essentials/data-collection-endpoint-overview.md) is required to accept the data being sent to Azure Monitor. After you configure the DCE and link it to a DCR, you can send data over HTTP from your application. The DCE must be located in the same region as the DCR and the Log Analytics workspace where the data will be sent.
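To make that HTTP call concrete before you create anything, here's a minimal, hedged sketch of a direct REST call to the Logs ingestion API, assuming Python with the `requests` and `azure-identity` packages. The endpoint URI, DCR immutable ID, stream name, and record fields are placeholders that you collect in later steps, and the `api-version` value is an assumption based on the current REST documentation; the sample code article linked earlier is the authoritative reference.

```python
# Hedged sketch only: send a batch of records over HTTP to the Logs ingestion API.
# Endpoint, DCR immutable ID, stream name, and record fields are placeholders
# gathered in later steps of this tutorial.
import json
import requests
from azure.identity import ClientSecretCredential

ENDPOINT = "https://<your-endpoint>.<region>.ingest.monitor.azure.com"  # DCE or DCR ingestion endpoint
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"               # placeholder
STREAM_NAME = "Custom-MyTable_CL"                                       # placeholder stream declaration

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
token = credential.get_token("https://monitor.azure.com/.default").token

records = [{"Time": "2024-01-01T00:00:00Z", "Computer": "test-vm", "AdditionalContext": "sample"}]

response = requests.post(
    f"{ENDPOINT}/dataCollectionRules/{DCR_IMMUTABLE_ID}/streams/{STREAM_NAME}",
    params={"api-version": "2023-01-01"},
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    data=json.dumps(records),
)
response.raise_for_status()  # a 204 No Content response means the payload was accepted
```

A successful call returns 204 No Content; it can take a few minutes before ingested data is queryable in the destination table.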
1. In the Azure portal's search box, enter **template** and then select **Deploy a custom template**.

    :::image type="content" source="media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows how to deploy a custom template.":::

1. Select **Build your own template in the editor**.

    :::image type="content" source="media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot that shows how to build a template in the editor.":::

1. Paste the following ARM template into the editor and then select **Save**. You don't need to modify this template because you'll provide values for its parameters.

    :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows how to edit an ARM template.":::
1. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the DCE, and then provide values like a **Name** for the DCE. The **Location** should be the same location as the workspace. The **Region** is already populated and is used for the location of the DCE.
    :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-custom-deploy.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-custom-deploy.png" alt-text="Screenshot that shows how to edit custom deployment values.":::

1. Select **Review + create** and then select **Create** after you review the details.

1. Select **JSON View** to view other details for the DCE. Copy the **Resource ID** and the **logsIngestion endpoint**, which you'll need in a later step.

    :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" alt-text="Screenshot that shows the DCE resource ID.":::

---
## Create new table in Log Analytics workspace

The custom table must be created before you can send data to it. The table for this tutorial includes the five columns shown in the schema below. The `name`, `type`, and `description` properties are mandatory for each column. The properties `isHidden` and `isDefaultDisplay` both default to `false` if not explicitly specified. Possible data types are `string`, `int`, `long`, `real`, `boolean`, `dateTime`, `guid`, and `dynamic`.
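The tutorial creates this table with an ARM REST call that carries the full five-column schema. As an illustration of the shape of that call only, here's a hedged Python sketch against the Log Analytics Tables API; the subscription, resource group, workspace, table name, and the two example columns are hypothetical placeholders rather than the tutorial's schema.

```python
# Hedged sketch: create a custom table with the Log Analytics Tables API.
# All resource names and both columns are placeholders; the tutorial's real
# five-column schema appears later in this article. Custom table names must
# end in _CL, and a TimeGenerated column of type dateTime is expected.
import requests
from azure.identity import DefaultAzureCredential

url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights"
    "/workspaces/<workspace-name>/tables/MyTable_CL"
)

body = {
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                {"name": "TimeGenerated", "type": "dateTime", "description": "Time the record was generated."},
                {"name": "RawData", "type": "string", "description": "Hypothetical payload column."},
            ],
        }
    }
}

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
response = requests.put(
    url,
    params={"api-version": "2022-10-01"},
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
response.raise_for_status()
print(response.status_code)  # 200 or 202 indicates the table create/update was accepted
```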
@@ -140,6 +214,8 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w

:::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows how to edit an ARM template.":::

## [DCR endpoint](#tab/dcr)

Notice the following details in the DCR defined in this template:

- `streamDeclarations`: Column definitions of the incoming data.
@@ -183,6 +259,7 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w

@@ -240,6 +317,105 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w

    }
}
```
## [DCE](#tab/dce)

Notice the following details in the DCR defined in this template:

- `dataCollectionEndpointId`: Resource ID of the data collection endpoint.
- `streamDeclarations`: Column definitions of the incoming data.
- `destinations`: Destination workspace.
- `dataFlows`: Matches the stream with the destination workspace and specifies the transformation query and the destination table. The output of the transformation query is what's sent to the destination table.
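As a hedged illustration of how these four properties fit together, the following sketch creates a similar DCR directly against the ARM REST API instead of through the portal template deployment that this tutorial uses. Every resource name, ID, and column is a placeholder, and the `api-version` is an assumption based on the current data collection rules REST reference.

```python
# Hedged sketch: create a DCR that references a DCE via the ARM REST API.
# All IDs, names, and columns are placeholders; the tutorial itself deploys
# an equivalent DCR with an ARM template in the portal.
import requests
from azure.identity import DefaultAzureCredential

DCR_URL = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.Insights"
    "/dataCollectionRules/<dcr-name>"
)

dcr = {
    "location": "<workspace-region>",
    "properties": {
        "dataCollectionEndpointId": "<dce-resource-id>",
        "streamDeclarations": {
            "Custom-MyTable_CL": {
                "columns": [
                    {"name": "Time", "type": "datetime"},
                    {"name": "Computer", "type": "string"},
                ]
            }
        },
        "destinations": {
            "logAnalytics": [
                {"workspaceResourceId": "<workspace-resource-id>", "name": "myworkspace"}
            ]
        },
        "dataFlows": [
            {
                "streams": ["Custom-MyTable_CL"],
                "destinations": ["myworkspace"],
                "transformKql": "source | extend TimeGenerated = now()",
                "outputStream": "Custom-MyTable_CL",
            }
        ],
    },
}

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
response = requests.put(
    DCR_URL,
    params={"api-version": "2022-06-01"},
    headers={"Authorization": f"Bearer {token}"},
    json=dcr,
)
response.raise_for_status()
print(response.status_code)
```

After deployment, the DCR's `immutableId` property is the value that ingestion calls target.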
4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the DCR. Then provide values defined in the template. The values include a **Name** for the DCR and the **Workspace Resource ID** that you collected in a previous step. The **Location** should be the same location as the workspace. The **Region** is already populated and is used for the location of the DCR.