Commit 4861fac (parent: e1235ba)

Update connect-logstash-data-connection-rules.md

Update article with cwatson-cat suggestions and remarks

File tree: 1 file changed, +5 −5 lines

articles/sentinel/connect-logstash-data-connection-rules.md

@@ -14,7 +14,7 @@ ms.author: lwainstein
 
 Microsoft Sentinel's new Logstash output plugin supports pipeline transformations and advanced configuration via Data Collection Rules (DCRs). The plugin forwards any type of logs from external data sources into custom or standard tables in Log Analytics or Microsoft Sentinel.
 
-In this article, you learn how to set up the new Logstash plugin to stream the data into Log Analytics using DCRs, with full control over the output schema. Learn how to **[deploy the plugin](#deploy-the-microsoft-sentinel-log-analytics-output-plugin-in-logstash)**.
+In this article, you learn how to set up the new Logstash plugin to stream the data into Log Analytics or Microsoft Sentinel using DCRs, with full control over the output schema. Learn how to **[deploy the plugin](#deploy-the-microsoft-sentinel-output-plugin-in-logstash)**.
 
 > [!NOTE]
 > A [previous version of the Logstash plugin](connect-logstash.md) allows you to connect data sources through Logstash via the Data Collection API.
@@ -49,7 +49,7 @@ The Logstash engine is comprised of three components:
 
 The Microsoft Sentinel output plugin for Logstash sends JSON-formatted data to your Log Analytics workspace, using the Log Analytics Log Ingestion API. The data is ingested into custom logs or standard table.
 
-- Learn more about the [Logs ingestion API](../azure-monitor/logs/logs-ingestion-api-overview).
+- Learn more about the [Logs ingestion API](../azure-monitor/logs/logs-ingestion-api-overview.md).
 
 ## Deploy the Microsoft Sentinel output plugin in Logstash
 
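The Logs ingestion API request that the plugin issues can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual code: the endpoint, DCR immutable ID, stream name, and `api-version` value are placeholder assumptions based on the Logs ingestion API documentation, to be replaced with your own values.

```python
import json
import urllib.request

# Hypothetical values -- replace with your own DCE, DCR, and stream name.
DCE_ENDPOINT = "https://my-dce-1234.eastus-1.ingest.monitor.azure.com"
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"
STREAM_NAME = "Custom-MyTableRawData"

def ingestion_url(endpoint: str, dcr_id: str, stream: str,
                  api_version: str = "2023-01-01") -> str:
    """Build the Logs ingestion API URL (shape assumed from the API docs)."""
    return (f"{endpoint}/dataCollectionRules/{dcr_id}"
            f"/streams/{stream}?api-version={api_version}")

def build_request(records: list[dict], bearer_token: str) -> urllib.request.Request:
    """The body is a JSON array of records, like the plugin's sample file."""
    return urllib.request.Request(
        ingestion_url(DCE_ENDPOINT, DCR_IMMUTABLE_ID, STREAM_NAME),
        data=json.dumps(records).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request additionally requires a Microsoft Entra token with ingestion permissions on the DCR, which is out of scope for this sketch.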

@@ -122,7 +122,7 @@ input {
 The plugin writes ten records to a sample file named `sampleFile<epoch seconds>.json` in the configured path. For example: *c:\temp\sampleFile1648453501.json*.
 Here is part of a sample file that the plugin creates:
 
-```
+```json
 [
 {
   "host": "logstashMachine",
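The hunk above notes that the sample file name embeds the creation time in epoch seconds. The snippet below simulates that naming convention and the ten-record sample purely for illustration; it is not the plugin's implementation, and the record fields are placeholders.

```python
import json
import tempfile
import time
from pathlib import Path

def write_sample_file(directory: str, records: list[dict]) -> Path:
    """Mimic the naming convention: sampleFile<epoch seconds>.json."""
    path = Path(directory) / f"sampleFile{int(time.time())}.json"
    path.write_text(json.dumps(records, indent=2))
    return path

# Ten placeholder records, loosely shaped like the article's sample output.
records = [{"host": "logstashMachine", "sequence": i} for i in range(10)]

with tempfile.TemporaryDirectory() as tmp:
    sample = write_sample_file(tmp, records)
    print(sample.name)  # e.g. sampleFile1648453501.json
```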
@@ -174,7 +174,7 @@ In this scenario, you configure the Logstash input plugin to send syslog events
 
 The plugin writes ten records to a sample file named `sampleFile<epoch seconds>.json` in the configured path. For example: *c:\temp\sampleFile1648453501.json*.
 Here is part of a sample file that the plugin creates:
-```
+```json
 [
 {
   "logsource": "logstashMachine",
@@ -252,7 +252,7 @@ Note that:
 - The `dataflows` property transforms the input to the Syslog table format, and sets the `outputStream` to `Microsoft-Syslog`.
 
 
-```
+```json
 {
   "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
   "contentVersion": "1.0.0.0",
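A `dataflows` entry of the shape the hunk describes might look like the fragment below, built here as a Python dict only so the structure is easy to check. The field names follow the DCR ARM template schema as commonly documented (`streams`, `destinations`, `transformKql`, `outputStream`); the stream name, destination name, and KQL transformation are hypothetical placeholders, not values from the article.

```python
import json

# Hypothetical stream/destination names; outputStream matches the diff text.
dataflow = {
    "streams": ["Custom-LogstashStream"],
    "destinations": ["myLogAnalyticsWorkspace"],
    "transformKql": "source | project TimeGenerated, Computer, SyslogMessage",
    "outputStream": "Microsoft-Syslog",
}

print(json.dumps({"dataFlows": [dataflow]}, indent=2))
```

The `transformKql` expression runs at ingestion time and must project columns that exist in the target table's schema.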

Comments (0)