Commit 82ed1e2

Merge pull request #1918 from fluent/lynettemiles/sc-136228/update-fluent-bit-docs-pipeline-outputs-azure
2 parents bd07152 + 66da244 commit 82ed1e2

File tree

2 files changed: +28 −29 lines changed


pipeline/outputs/azure_logs_ingestion.md

Lines changed: 27 additions & 29 deletions

```diff
@@ -1,44 +1,42 @@
 ---
-description: 'Send logs to Azure Log Analytics using Logs Ingestion API with DCE and DCR'
+description: Send logs to Azure Log Analytics using Logs Ingestion API
 ---
 
 # Azure Logs Ingestion API
 
-Azure Logs Ingestion plugin allows you to ingest your records using [Logs Ingestion API in Azure Monitor](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview) to supported [Azure tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables) or to [custom tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table#create-a-custom-table) that you create.
+Azure Logs Ingestion plugin lets you ingest your records using [Logs Ingestion API in Azure Monitor](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview) to supported [Azure tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables) or to [custom tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table#create-a-custom-table) that you create.
 
 The Logs ingestion API requires the following components:
 
 - A Data Collection Endpoint (DCE)
 - A Data Collection Rule (DCR) and
 - A Log Analytics Workspace
 
-> Note: According to [this document](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/azure-monitor/logs/logs-ingestion-api-overview.md#components), all resources should be in the same region.
+To visualize the basic logs ingestion operation, see the following image:
 
-To visualize basic Logs Ingestion operation, see the following image:
+![Log ingestion overview](../../.gitbook/assets/azure-logs-ingestion-overview.png)
 
-![](../../.gitbook/assets/azure-logs-ingestion-overview.png)
-
-To get more details about how to set up these components, please refer to the following documentations:
+To get more details about how to set up these components, refer to the following documentation:
 
 - [Azure Logs Ingestion API](https://docs.microsoft.com/en-us/azure/log-analytics/)
 - [Send data to Azure Monitor Logs with Logs ingestion API (setup DCE, DCR and Log Analytics)](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal)
 
-## Configuration Parameters
+## Configuration parameters
 
 | Key | Description | Default |
 | :------------ | :------------------------- | :------ |
-| tenant\_id | _Required_ - The tenant ID of the AAD application. ||
-| client\_id | _Required_ - The client ID of the AAD application. ||
-| client\_secret| _Required_ - The client secret of the AAD application ([App Secret](https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal#option-2-create-a-new-application-secret)). ||
-| dce\_url | _Required_ - Data Collection Endpoint(DCE) URL. ||
-| dcr\_id | _Required_ - Data Collection Rule (DCR) immutable ID (see [this document](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#collect-information-from-the-dcr) to collect the immutable id) ||
-| table\_name | _Required_ - The name of the custom log table (include the `_CL` suffix as well if applicable) ||
-| time\_key | _Optional_ - Specify the key name where the timestamp will be stored. | `@timestamp` |
-| time\_generated | _Optional_ - If enabled, will generate a timestamp and append it to JSON. The key name is set by the 'time_key' parameter. | `true` |
-| compress | _Optional_ - Enable HTTP payload gzip compression. | `true` |
-| workers | The number of [workers](../../administration/multithreading.md#outputs) to perform flush operations for this output. | `0` |
-
-## Getting Started
+| `tenant_id` | The tenant ID of the Azure Active Directory (AAD) application. | _none_ |
+| `client_id` | The client ID of the AAD application. | _none_ |
+| `client_secret`| The client secret of the AAD application ([App Secret](https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal#option-2-create-a-new-application-secret)). | _none_ |
+| `dce_url` | Data Collection Endpoint(DCE) URL. | _none_ |
+| `dcr_id` | Data Collection Rule (DCR) [immutable ID](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#collect-information-from-the-dcr). | _none_ |
+| `table_name` | The name of the custom log table (include the `_CL` suffix as well if applicable) | _none_ |
+| `time_key` | Optional. Specify the key name where the timestamp will be stored. | `@timestamp` |
+| `time_generated` | Optional. If enabled, will generate a timestamp and append it to JSON. The key name is set by the `time_key` parameter. | `true` |
+| `compress` | Optional. Enable HTTP payload gzip compression. | `true` |
+| `workers` | The number of [workers](../../administration/multithreading.md#outputs) to perform flush operations for this output. | `0` |
+
+## Get started
 
 To send records into an Azure Log Analytics using Logs Ingestion API the following resources needs to be created:
 
@@ -47,11 +45,11 @@ To send records into an Azure Log Analytics using Logs Ingestion API the followi
 - Either an [Azure tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables) or [custom tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table#create-a-custom-table)
 - An app registration with client secrets (for DCR access).
 
-You can follow [this guideline](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal) to set up the DCE, DCR, app registration and a custom table.
+Follow [this guideline](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal) to set up the DCE, DCR, app registration and a custom table.
 
-### Configuration File
+### Configuration file
 
-Use this configuration to quickly get started:
+Use this configuration file to get started:
 
 {% tabs %}
 {% tab title="fluent-bit.yaml" %}
@@ -62,9 +60,9 @@ pipeline:
     - name: tail
       path: /path/to/your/sample.log
       tag: sample
-     key: RawData
-
-  # Or use other plugins
+      key: RawData
+
+    # Or use other plugins
     #- name: cpu
     #  tag: sample
 
@@ -73,12 +71,12 @@ pipeline:
       match: sample
       # Add a json key named "Application":"fb_log"
       add: Application fb_log
-
+
   outputs:
     # Enable this section to see your json-log format
     #- name: stdout
     #  match: '*'
-
+
     - name: azure_logs_ingestion
       match: sample
       client_id: XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
@@ -135,4 +133,4 @@ pipeline:
 {% endtab %}
 {% endtabs %}
 
-Set up your DCR transformation accordingly based on the json output from fluent-bit's pipeline (input, parser, filter, output).
+Set up your DCR transformation based on the JSON output from the Fluent Bit pipeline (input, parser, filter, output).
```
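The diff shows only the first lines of the `azure_logs_ingestion` output section before the hunk is cut off. As a rough sketch of how the parameters from the updated table fit together, a minimal output block might look like the following (every ID, URL, secret, and table name below is a placeholder, not a value taken from this commit):

```yaml
pipeline:
  outputs:
    - name: azure_logs_ingestion
      match: sample
      # Placeholder credentials for the AAD app registration
      tenant_id: XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
      client_id: XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
      client_secret: some.client.secret
      # Placeholder DCE endpoint and DCR immutable ID
      dce_url: https://my-dce.eastus-1.ingest.monitor.azure.com
      dcr_id: dcr-00000000000000000000000000000000
      # Custom tables carry the _CL suffix
      table_name: Custom_Table_CL
      time_key: '@timestamp'
      time_generated: true
      compress: true
```

The key names match the renamed table entries in the diff (`tenant_id`, `client_id`, `client_secret`, `dce_url`, `dcr_id`, `table_name`, `time_key`, `time_generated`, `compress`).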

vale-styles/FluentBit/Headings.yml

Lines changed: 1 addition & 0 deletions

```diff
@@ -15,6 +15,7 @@ exceptions:
   - API
   - APIs
   - Azure
+  - Azure Logs Ingestion API
   - Azure Log Analytics
   - BuildKite
   - CircleCI
```
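The `Headings.yml` change extends the `exceptions` list of a Vale heading-capitalization rule, so the new page title passes the linter. For context, a style file of this kind is typically shaped as below (the `message` and `level` values are assumptions for illustration, not taken from this repository):

```yaml
# Sketch of a Vale heading-capitalization style; only the
# exceptions entries are confirmed by this commit's diff.
extends: capitalization
message: "'%s' should use sentence-style capitalization."
level: warning
scope: heading
match: $sentence
exceptions:
  - API
  - APIs
  - Azure
  - Azure Logs Ingestion API
  - Azure Log Analytics
```

Terms under `exceptions` are exempt from the `$sentence` capitalization check, which is why the `# Azure Logs Ingestion API` heading can keep its title-cased product name while other headings in the PR move to sentence case (`## Configuration parameters`, `## Get started`).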

0 commit comments
