
Commit e9429f3

agup006 authored and gitbook-bot committed
GITBOOK-4: No subject
1 parent bc8357c commit e9429f3

File tree: 2 files changed, +6 −53 lines changed


SUMMARY.md

Lines changed: 6 additions & 6 deletions
```diff
@@ -55,7 +55,7 @@
 * [Multiline Parsers](administration/configuring-fluent-bit/yaml/multiline-parsers-section.md)
 * [Pipeline](administration/configuring-fluent-bit/yaml/pipeline-section.md)
 * [Plugins](administration/configuring-fluent-bit/yaml/plugins-section.md)
-* [Upstream Servers](administration/configuring-fluent-bit/yaml/upstream-servers-section.md))
+* [Upstream Servers](administration/configuring-fluent-bit/yaml/upstream-servers-section.md)
 * [Environment Variables](administration/configuring-fluent-bit/yaml/environment-variables-section.md)
 * [Includes](administration/configuring-fluent-bit/yaml/includes-section.md)
 * [Classic mode](administration/configuring-fluent-bit/classic-mode/README.md)
@@ -174,8 +174,8 @@
 * [Amazon Kinesis Data Firehose](pipeline/outputs/firehose.md)
 * [Amazon Kinesis Data Streams](pipeline/outputs/kinesis.md)
 * [Amazon S3](pipeline/outputs/s3.md)
-* [Azure Blob](pipeline/outputs/azure\_blob.md)
-* [Azure Data Explorer](pipeline/outputs/azure\_kusto.md)
+* [Azure Blob](pipeline/outputs/azure_blob.md)
+* [Azure Data Explorer](pipeline/outputs/azure_kusto.md)
 * [Azure Log Analytics](pipeline/outputs/azure.md)
 * [Azure Logs Ingestion API](pipeline/outputs/azure_logs_ingestion.md)
 * [Counter](pipeline/outputs/counter.md)
@@ -195,12 +195,12 @@
 * [Kafka REST Proxy](pipeline/outputs/kafka-rest-proxy.md)
 * [LogDNA](pipeline/outputs/logdna.md)
 * [Loki](pipeline/outputs/loki.md)
-* [Microsoft Fabric](pipeline/outputs/azure\_kusto.md)
+* [Microsoft Fabric](pipeline/outputs/azure_kusto.md)
 * [NATS](pipeline/outputs/nats.md)
 * [New Relic](pipeline/outputs/new-relic.md)
 * [NULL](pipeline/outputs/null.md)
 * [Observe](pipeline/outputs/observe.md)
-* [OpenObserve](pipeline/inputs/openobserve.md)
+* [OpenObserve](pipeline/outputs/openobserve.md)
 * [OpenSearch](pipeline/outputs/opensearch.md)
 * [OpenTelemetry](pipeline/outputs/opentelemetry.md)
 * [Oracle Log Analytics](pipeline/outputs/oci-logging-analytics.md)
@@ -230,7 +230,7 @@
 
 ## Fluent Bit for Developers <a href="#development" id="development"></a>
 
-* [C Library API](development/library\_api.md)
+* [C Library API](development/library_api.md)
 * [Ingest Records Manually](development/ingest-records-manually.md)
 * [Golang Output Plugins](development/golang-output-plugins.md)
 * [WASM Filter Plugins](development/wasm-filter-plugins.md)
```

pipeline/outputs/openobserve.md

Lines changed: 0 additions & 47 deletions
````diff
@@ -1,49 +1,2 @@
----
-title: OpenObserve
-description: Send logs to OpenObserve using Fluent Bit
----
-
 # OpenObserve
 
-Use the OpenObserve output plugin to ingest logs into [OpenObserve](https://openobserve.ai/).
-
-Before you begin, you need an [OpenObserve account](https://cloud.openobserve.ai/), an
-`HTTP_User`, and an `HTTP_Passwd`. You can find these fields under **Ingestion** in
-OpenObserve Cloud. Alternatively, you can achieve this with various installation
-types as mentioned in the
-[OpenObserve documentation](https://openobserve.ai/docs/quickstart/)
-
-## Configuration Parameters
-
-| Key | Description | Default |
-| --- | --- | --- |
-| Host | Required. The OpenObserve server where you are sending logs. | `localhost` |
-| TLS | Required: Enable end-to-end security using TLS. Set to `on` to enable TLS communication with OpenObserve. | `on` |
-| compress | Recommended: Compresses the payload in GZIP format. OpenObserve supports and recommends setting this to `gzip` for optimized log ingestion. | _none_ |
-| HTTP_User | Required: Username for HTTP authentication. | _none_ |
-| HTTP_Passwd | Required: Password for HTTP authentication. | _none_ |
-| URI | Required: The API path used to send logs. | `/api/default/default/_json` |
-| Format | Required: The format of the log payload. OpenObserve expects JSON. | `json` |
-| json_date_key | Optional: The JSON key used for timestamps in the logs. | `timestamp` |
-| json_date_format | Optional: The format of the date in logs. OpenObserve supports ISO 8601. | `iso8601` |
-| include_tag_key | If `true`, a tag is appended to the output. The key name is used in the `tag_key` property. | `false` |
-
-### Configuration File
-
-Use this configuration file to get started:
-
-```
-[OUTPUT]
-    Name             http
-    Match            *
-    URI              /api/default/default/_json
-    Host             localhost
-    Port             5080
-    tls              on
-    Format           json
-    Json_date_key    timestamp
-    Json_date_format iso8601
-    HTTP_User        <YOUR_HTTP_USER>
-    HTTP_Passwd      <YOUR_HTTP_PASSWORD>
-    compress         gzip
-```
````
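For reference, the classic-mode `[OUTPUT]` block removed above maps onto Fluent Bit's YAML configuration format roughly as follows. This is a sketch and is not part of the commit; it assumes the same placeholder credentials and default OpenObserve port:

```yaml
# Sketch: YAML-format equivalent of the removed classic [OUTPUT] block.
# <YOUR_HTTP_USER> / <YOUR_HTTP_PASSWORD> are placeholders, as in the original.
pipeline:
  outputs:
    - name: http
      match: "*"
      host: localhost
      port: 5080
      uri: /api/default/default/_json
      tls: on
      format: json
      json_date_key: timestamp
      json_date_format: iso8601
      http_user: <YOUR_HTTP_USER>
      http_passwd: <YOUR_HTTP_PASSWORD>
      compress: gzip
```

Property names are case-insensitive in Fluent Bit, so the lowercase keys here carry the same meaning as the mixed-case keys in the classic-mode example.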
