48 changes: 24 additions & 24 deletions pipeline/outputs/azure.md
@@ -1,40 +1,40 @@
---
description: 'Send logs, metrics to Azure Log Analytics'
description: Send logs, metrics to Azure Log Analytics
---

# Azure Log Analytics

Azure output plugin allows to ingest your records into [Azure Log Analytics](https://azure.microsoft.com/en-us/services/log-analytics/) service.
Azure output plugin lets you ingest your records into [Azure Log Analytics](https://azure.microsoft.com/en-us/services/log-analytics/) service.

To get more details about how to setup Azure Log Analytics, please refer to the following documentation: [Azure Log Analytics](https://docs.microsoft.com/en-us/azure/log-analytics/)
For details about how to set up Azure Log Analytics, see the [Azure Log Analytics](https://docs.microsoft.com/en-us/azure/log-analytics/) documentation.

## Configuration Parameters
## Configuration parameters

| Key | Description | default |
| Key | Description | Default |
| :--- | :--- | :--- |
| Customer\_ID | Customer ID or WorkspaceID string. | |
| Shared\_Key | The primary or the secondary Connected Sources client authentication key. | |
| Log\_Type | The name of the event type. | fluentbit |
| Log_Type_Key | If included, the value for this key will be looked upon in the record and if present, will over-write the `log_type`. If not found then the `log_type` value will be used. | |
| Time\_Key | Optional parameter to specify the key name where the timestamp will be stored. | @timestamp |
| Time\_Generated | If enabled, the HTTP request header 'time-generated-field' will be included so Azure can override the timestamp with the key specified by 'time_key' option. | off |
| Workers | The number of [workers](../../administration/multithreading.md#outputs) to perform flush operations for this output. | `0` |
| `Customer_ID` | Customer ID or WorkspaceID string. | _none_ |
| `Shared_Key` | The primary or the secondary Connected Sources client authentication key. | _none_ |
| `Log_Type` | The name of the event type. | `fluentbit` |
| `Log_Type_Key` | If included, the value for this key is checked in the record and, if present, overwrites `log_type`. If not found, the `log_type` value is used. | _none_ |
| `Time_Key` | Optional. Specify the key name where the timestamp will be stored. | `@timestamp` |
| `Time_Generated` | If enabled, the HTTP request header `time-generated-field` will be included so Azure can override the timestamp with the key specified by `time_key` option. | `off` |
| `Workers` | The number of [workers](../../administration/multithreading.md#outputs) to perform flush operations for this output. | `0` |
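
For illustration, a sketch of an output section that sets the optional timestamp parameters alongside the required credentials (`abc`, `def`, and `my_events` are placeholder values, not real credentials or event types):

```yaml
pipeline:
  outputs:
    - name: azure
      match: '*'
      customer_id: abc
      shared_key: def
      log_type: my_events
      time_key: '@timestamp'
      time_generated: true
```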

## Getting Started
## Get started

In order to insert records into an Azure Log Analytics instance, you can run the plugin from the command line or through the configuration file:
To insert records into an Azure Log Analytics instance, run the plugin from the command line or through the configuration file.

### Command Line
### Command line

The **azure** plugin, can read the parameters from the command line in two ways, through the **-p** argument \(property\), e.g:
The _Azure_ plugin can read its parameters from the command line through the `-p` argument (property):

```shell
fluent-bit -i cpu -o azure -p customer_id=abc -p shared_key=def -m '*' -f 1
```

### Configuration File
### Configuration file

In your main configuration file append the following _Input_ & _Output_ sections:
In your main configuration file append the following sections:

{% tabs %}
{% tab title="fluent-bit.yaml" %}
@@ -43,12 +43,12 @@ In your main configuration file append the following _Input_ & _Output_ sections
pipeline:
inputs:
- name: cpu

outputs:
- name: azure
match: '*'
customer_id: abc
shared_key: def
shared_key: def
```

{% endtab %}
@@ -68,7 +68,7 @@ pipeline:
{% endtab %}
{% endtabs %}
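
A rough classic-mode (`fluent-bit.conf`) equivalent of the same pipeline, using the same placeholder credentials, would look like this sketch:

```text
[INPUT]
    Name cpu

[OUTPUT]
    Name        azure
    Match       *
    Customer_ID abc
    Shared_Key  def
```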

Another example using the `Log_Type_Key` with [record-accessor](https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/classic-mode/record-accessor), which will read the table name (or event type) dynamically from kubernetes label `app`, instead of `Log_Type`:
The following example uses `Log_Type_Key` with [record-accessor](https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/classic-mode/record-accessor), which reads the table name (or event type) dynamically from the Kubernetes label `app`, instead of `Log_Type`:

{% tabs %}
{% tab title="fluent-bit.yaml" %}
@@ -77,13 +77,13 @@ Another example using the `Log_Type_Key` with [record-accessor](https://docs.flu
pipeline:
inputs:
- name: cpu

outputs:
- name: azure
match: '*'
log_type_key: $kubernetes['labels']['app']
customer_id: abc
shared_key: def
shared_key: def
```

{% endtab %}
@@ -102,4 +102,4 @@ pipeline:
```

{% endtab %}
{% endtabs %}
{% endtabs %}
1 change: 1 addition & 0 deletions vale-styles/FluentBit/Headings.yml
@@ -14,6 +14,7 @@ exceptions:
- API
- APIs
- Azure
- Azure Log Analytics
- BuildKite
- CircleCI
- CLI