
Commit 31b7a00

Merge pull request #277773 from bwren/dce-fix
Add DCE back to logs ingestion
2 parents 3eaf455 + 00c2463 commit 31b7a00

File tree

2 files changed: +183 -7 lines changed

articles/azure-monitor/logs/tutorial-logs-ingestion-api.md

Lines changed: 177 additions & 1 deletion
@@ -23,7 +23,7 @@ The steps required to configure the Logs ingestion API are as follows:
  6. See [Sample code to send data to Azure Monitor using Logs ingestion API](tutorial-logs-ingestion-code.md) for sample code to send data using the Logs ingestion API.

  > [!NOTE]
- > This article previously included a step to create a data collection endpoint (DCE). This is no longer required since [DCRs now include their own endpoint](../essentials/data-collection-endpoint-overview.md). A DCE is only required with Logs ingestion API if private link is used.
+ > This article includes options for using a DCR ingestion endpoint or a data collection endpoint (DCE). You can choose to use either one, but a DCE is required with Logs ingestion API if private link is used. See [When is a DCE required?](../essentials/data-collection-endpoint-overview.md#when-is-a-dce-required).

  ## Prerequisites
  To complete this tutorial, you need:
@@ -64,6 +64,80 @@ Start by registering a Microsoft Entra application to authenticate against the A

    :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" alt-text="Screenshot that shows the secret value for the new app.":::

## Create data collection endpoint

## [DCR endpoint](#tab/dcr)

A DCE isn't required if you use the DCR ingestion endpoint.

## [DCE](#tab/dce)

A [DCE](../essentials/data-collection-endpoint-overview.md) is required to accept the data being sent to Azure Monitor. After you configure the DCE and link it to a DCR, you can send data over HTTP from your application. The DCE must be located in the same region as the DCR and the Log Analytics workspace where the data will be sent.

1. In the Azure portal's search box, enter **template** and then select **Deploy a custom template**.

    :::image type="content" source="media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows how to deploy a custom template.":::

1. Select **Build your own template in the editor**.

    :::image type="content" source="media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot that shows how to build a template in the editor.":::

1. Paste the following ARM template into the editor and then select **Save**. You don't need to modify this template because you'll provide values for its parameters.

    :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows how to edit an ARM template.":::

    ```json
    {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
            "dataCollectionEndpointName": {
                "type": "string",
                "metadata": {
                    "description": "Specifies the name of the Data Collection Endpoint to create."
                }
            },
            "location": {
                "type": "string",
                "defaultValue": "westus2",
                "metadata": {
                    "description": "Specifies the location for the Data Collection Endpoint."
                }
            }
        },
        "resources": [
            {
                "type": "Microsoft.Insights/dataCollectionEndpoints",
                "name": "[parameters('dataCollectionEndpointName')]",
                "location": "[parameters('location')]",
                "apiVersion": "2021-04-01",
                "properties": {
                    "networkAcls": {
                        "publicNetworkAccess": "Enabled"
                    }
                }
            }
        ],
        "outputs": {
            "dataCollectionEndpointId": {
                "type": "string",
                "value": "[resourceId('Microsoft.Insights/dataCollectionEndpoints', parameters('dataCollectionEndpointName'))]"
            }
        }
    }
    ```

1. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the DCE and then provide values like a **Name** for the DCE. The **Location** should be the same location as the workspace. The **Region** will already be populated and will be used for the location of the DCE.

    :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-custom-deploy.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-custom-deploy.png" alt-text="Screenshot to edit custom deployment values.":::

1. Select **Review + create** and then select **Create** after you review the details.

1. Select **JSON View** to view other details for the DCE. Copy the **Resource ID** and the **logsIngestion endpoint**, which you'll need in a later step.

    :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" alt-text="Screenshot that shows the DCE resource ID.":::

---
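If you'd rather deploy the DCE template from a command line than through the portal, you can supply the parameter values in a parameters file. This is a minimal sketch; the file names `dce-template.json` and `dce-parameters.json` and the endpoint name `my-dce` are placeholders, not values from this tutorial:

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionEndpointName": { "value": "my-dce" },
        "location": { "value": "westus2" }
    }
}
```

You could then run `az deployment group create --resource-group <resource-group> --template-file dce-template.json --parameters @dce-parameters.json` to deploy it.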
## Create new table in Log Analytics workspace

The custom table must be created before you can send data to it. The table for this tutorial will include five columns shown in the schema below. The `name`, `type`, and `description` properties are mandatory for each column. The properties `isHidden` and `isDefaultDisplay` both default to `false` if not explicitly specified. Possible data types are `string`, `int`, `long`, `real`, `boolean`, `dateTime`, `guid`, and `dynamic`.
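As a concrete illustration of these schema rules, the following Python sketch assembles a five-column table schema of the kind used in this tutorial. The table name and descriptions are examples, not values you must use; only the `name`/`type`/`description` structure and the optional properties reflect the rules above:

```python
# Sketch: assemble a custom table schema following the rules above.
# Each column requires name, type, and description; isHidden and
# isDefaultDisplay are optional and default to false when omitted.

def column(name, col_type, description, **optional):
    """Build one column definition; optional keys such as isHidden may be passed."""
    col = {"name": name, "type": col_type, "description": description}
    col.update(optional)
    return col

schema = {
    "name": "MyTable_CL",  # custom log table names end in _CL
    "columns": [
        column("TimeGenerated", "dateTime", "The time at which the data was generated"),
        column("Computer", "string", "The computer that generated the data"),
        column("AdditionalContext", "dynamic", "Additional message properties"),
        column("CounterName", "string", "Name of the counter"),
        column("CounterValue", "real", "Value collected for the counter"),
    ],
}
```

Every column here uses one of the permitted data types, and none of the optional properties are set, so `isHidden` and `isDefaultDisplay` take their `false` defaults.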
@@ -140,6 +214,8 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w

      :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows how to edit an ARM template.":::

+ ## [DCR endpoint](#tab/dcr)
+
  Notice the following details in the DCR defined in this template:

  - `streamDeclarations`: Column definitions of the incoming data.

@@ -183,6 +259,7 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w
      "location": "[parameters('location')]",
      "apiVersion": "2021-09-01-preview",
      "properties": {
+         "dataCollectionEndpointId": "[parameters('endpointResourceId')]",
          "streamDeclarations": {
              "Custom-MyTableRawData": {
                  "columns": [
@@ -240,6 +317,105 @@ The [DCR](../essentials/data-collection-rule-overview.md) defines how the data w

## [DCE](#tab/dce)

Notice the following details in the DCR defined in this template:

- `dataCollectionEndpointId`: Resource ID of the data collection endpoint.
- `streamDeclarations`: Column definitions of the incoming data.
- `destinations`: Destination workspace.
- `dataFlows`: Matches the stream with the destination workspace and specifies the transformation query and the destination table. The output of the transformation query is what will be sent to the destination table.

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2023-03-11/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRuleName": {
            "type": "string",
            "metadata": {
                "description": "Specifies the name of the Data Collection Rule to create."
            }
        },
        "location": {
            "type": "string",
            "metadata": {
                "description": "Specifies the location in which to create the Data Collection Rule."
            }
        },
        "endpointResourceId": {
            "type": "string",
            "metadata": {
                "description": "Specifies the Azure resource ID of the Data Collection Endpoint to use."
            }
        },
        "workspaceResourceId": {
            "type": "string",
            "metadata": {
                "description": "Specifies the Azure resource ID of the Log Analytics workspace to use."
            }
        }
    },
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "name": "[parameters('dataCollectionRuleName')]",
            "location": "[parameters('location')]",
            "apiVersion": "2021-09-01-preview",
            "properties": {
                "dataCollectionEndpointId": "[parameters('endpointResourceId')]",
                "streamDeclarations": {
                    "Custom-MyTableRawData": {
                        "columns": [
                            {
                                "name": "Time",
                                "type": "datetime"
                            },
                            {
                                "name": "Computer",
                                "type": "string"
                            },
                            {
                                "name": "AdditionalContext",
                                "type": "string"
                            },
                            {
                                "name": "CounterName",
                                "type": "string"
                            },
                            {
                                "name": "CounterValue",
                                "type": "real"
                            }
                        ]
                    }
                },
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaceResourceId')]",
                            "name": "myworkspace"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [
                            "Custom-MyTableRawData"
                        ],
                        "destinations": [
                            "myworkspace"
                        ],
                        "transformKql": "source | extend jsonContext = parse_json(AdditionalContext) | project TimeGenerated = Time, Computer, AdditionalContext = jsonContext, CounterName=tostring(jsonContext.CounterName), CounterValue=toreal(jsonContext.CounterValue)",
                        "outputStream": "Custom-MyTable_CL"
                    }
                ]
            }
        }
    ],
    "outputs": {
        "dataCollectionRuleId": {
            "type": "string",
            "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
        }
    }
}
```

---

4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the DCR. Then provide values defined in the template. The values include a **Name** for the DCR and the **Workspace Resource ID** that you collected in a previous step. The **Location** should be the same location as the workspace. The **Region** will already be populated and will be used for the location of the DCR.

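The `transformKql` query in the DCR parses the JSON string in `AdditionalContext` and projects new columns from it. As a rough illustration of what that query does to a single record (not a substitute for KQL), the same reshaping might look like this in Python:

```python
# Sketch: mimic the DCR transformation for one incoming record.
# The record shape matches the Custom-MyTableRawData stream declaration.
import json

def transform(record):
    """Parse AdditionalContext and project the output columns."""
    context = json.loads(record["AdditionalContext"])
    return {
        "TimeGenerated": record["Time"],
        "Computer": record["Computer"],
        "AdditionalContext": context,                      # parsed, like parse_json()
        "CounterName": str(context["CounterName"]),        # like tostring()
        "CounterValue": float(context["CounterValue"]),    # like toreal()
    }

incoming = {
    "Time": "2024-01-01T00:00:00Z",
    "Computer": "Computer1",
    "AdditionalContext": '{"CounterName": "AppMetric1", "CounterValue": 15.3}',
}
```

The output dictionary corresponds to the columns of the destination table `MyTable_CL` named by `outputStream`.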
articles/azure-monitor/logs/tutorial-logs-ingestion-code.md

Lines changed: 6 additions & 6 deletions
@@ -43,7 +43,7 @@ The following script uses the [Azure Monitor Ingestion client library for .NET](
  using Azure.Monitor.Ingestion;

  // Initialize variables
- var endpoint = new Uri("https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com");
+ var endpoint = new Uri("https://my-url.monitor.azure.com");
  var ruleId = "dcr-00000000000000000000000000000000";
  var streamName = "Custom-MyTableRawData";
@@ -160,7 +160,7 @@ The following sample code uses the [Azure Monitor Ingestion Logs client module f
  )

  // logs ingestion URI
- const endpoint = "https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com"
+ const endpoint = "https://my-url.monitor.azure.com"
  // data collection rule (DCR) immutable ID
  const ruleID = "dcr-00000000000000000000000000000000"
  // stream name in the DCR that represents the destination table
@@ -257,7 +257,7 @@ The following sample code uses the [Azure Monitor Ingestion client library for J
  public static void main(String[] args) {

      LogsIngestionClient client = new LogsIngestionClientBuilder()
-         .endpoint("https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com")
+         .endpoint("https://my-url.monitor.azure.com")
          .credential(new DefaultAzureCredentialBuilder().build())
          .buildClient();
@@ -325,7 +325,7 @@ The following sample code uses the [Azure Monitor Ingestion client library for J
  require("dotenv").config();

  async function main() {
-     const logsIngestionEndpoint = "https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com";
+     const logsIngestionEndpoint = "https://my-url.monitor.azure.com";
      const ruleId = "dcr-00000000000000000000000000000000";
      const streamName = "Custom-MyTableRawData";
      const credential = new DefaultAzureCredential();
@@ -403,7 +403,7 @@ The following PowerShell code sends data to the endpoint by using HTTP REST fund
  $appSecret = "0000000000000000000000000000000000000000" # Secret created for the application

  # information needed to send data to the DCR endpoint
- $endpoint_uri = "https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com" # Logs ingestion URI for the DCR
+ $endpoint_uri = "https://my-url.monitor.azure.com" # Logs ingestion URI for the DCR
  $dcrImmutableId = "dcr-00000000000000000000000000000000" # the immutableId property of the DCR object
  $streamName = "Custom-MyTableRawData" # name of the stream in the DCR that represents the destination table
@@ -484,7 +484,7 @@ The following sample code uses the [Azure Monitor Ingestion client library for P

  # information needed to send data to the DCR endpoint
- endpoint_uri = "https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com" # logs ingestion endpoint of the DCR
+ endpoint_uri = "https://my-url.monitor.azure.com" # logs ingestion endpoint of the DCR
  dcr_immutableid = "dcr-00000000000000000000000000000000" # immutableId property of the Data Collection Rule
  stream_name = "Custom-MyTableRawData" # name of the stream in the DCR that represents the destination table
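Whichever language you use, the client libraries ultimately issue the same HTTP POST to the DCR's logs ingestion endpoint. The sketch below builds that request with only Python's standard library; the URI format and `api-version` value are assumptions to verify against the Logs Ingestion API reference, and token acquisition is out of scope here:

```python
# Sketch: the REST call the client libraries make under the hood.
# URI format and api-version are assumptions to check against the
# Logs Ingestion API reference; bearer token acquisition is omitted.
import json
import urllib.request

endpoint_uri = "https://my-url.monitor.azure.com"
dcr_immutableid = "dcr-00000000000000000000000000000000"
stream_name = "Custom-MyTableRawData"

def build_request(logs, bearer_token):
    """Build the POST request that sends a batch of records to the DCR stream."""
    uri = (f"{endpoint_uri}/dataCollectionRules/{dcr_immutableid}"
           f"/streams/{stream_name}?api-version=2023-01-01")
    body = json.dumps(logs).encode("utf-8")
    return urllib.request.Request(
        uri,
        data=body,
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# urllib.request.urlopen(build_request(logs, token)) would then send the batch.
```

The body is a JSON array of records whose fields match the stream declaration in the DCR; the DCR's transformation then reshapes them into the destination table's columns.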
