
Commit 4561699

edsiper authored and gitbook-bot committed
GITBOOK-5: No subject
1 parent e2a260d commit 4561699

File tree

2 files changed: +125 -2 lines changed


SUMMARY.md

Lines changed: 3 additions & 2 deletions

```diff
@@ -1,6 +1,6 @@
 # Table of contents
 
-* [Fluent Bit v2.0 Documentation](README.md)
+* [Fluent Bit v2.1 Documentation](README.md)
 
 ## About
 
@@ -68,7 +68,7 @@
 * [Monitoring](administration/monitoring.md)
 * [HTTP Proxy](administration/http-proxy.md)
 * [Troubleshooting](administration/troubleshooting.md)
-
+
 ## Local Testing
 
 * [Validating your Data and Structure](local-testing/validating-your-data-and-structure.md)
@@ -178,6 +178,7 @@
 * [Syslog](pipeline/outputs/syslog.md)
 * [TCP & TLS](pipeline/outputs/tcp-and-tls.md)
 * [Treasure Data](pipeline/outputs/treasure-data.md)
+* [Vivo Exporter](pipeline/outputs/vivo-exporter.md)
 * [WebSocket](pipeline/outputs/websocket.md)
 
 ## Stream Processing
```

pipeline/outputs/vivo-exporter.md

Lines changed: 122 additions & 0 deletions

# Vivo Exporter

Vivo Exporter is an output plugin that exposes logs, metrics, and traces through an HTTP endpoint. This plugin is intended to be used in conjunction with the [Vivo project](https://github.com/calyptia/vivo).

### Configuration Parameters

| Key | Description | Default |
| --- | ----------- | ------- |
| `empty_stream_on_read` | If enabled, the stream content will be removed when an HTTP client consumes the data from that stream. | Off |
| `stream_queue_size` | Specify the maximum queue size per stream. Each stream (logs, metrics, and traces) can hold up to `stream_queue_size` bytes. | 20M |

### Getting Started

```python
[INPUT]
    name dummy
    tag  events
    rate 2

[OUTPUT]
    name                 vivo_exporter
    empty_stream_on_read off
    stream_queue_size    20M
```

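To try it, save the configuration to a file and start Fluent Bit with it (the file name below is just an example):

```bash
# Start Fluent Bit with the configuration above
fluent-bit -c vivo-exporter.conf
```
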
### How it works

Vivo Exporter provides buffers that serve as streams for each telemetry data type: `logs`, `metrics`, and `traces`. Each buffer has a fixed capacity (20M by default). When data arrives at a stream, it is appended to the end; if the buffer is full, the oldest entries are removed to make room for the new data.

The data arrives in the form of a `chunk`. A chunk is a group of events that belong to the same type (logs, metrics, or traces) and share the same `tag`. Every chunk placed in a stream is assigned an auto-incremented `id`.

#### Requesting data from the streams

You can retrieve the data from the streams with a simple HTTP request. The following endpoints are available:

| Endpoint | Description |
| -------- | ----------- |
| `/logs` | Exposes log events in JSON format. Each event contains a timestamp, metadata, and the event content. |
| `/metrics` | Exposes metric events in JSON format. Each metric contains a name, metadata, metric type, and labels (dimensions). |
| `/traces` | Exposes trace events in JSON format. Each trace contains a name, resource spans, spans, attributes, events information, and so on. |

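For instance, each stream can be fetched with a plain HTTP GET. The address and port below are the ones used in the examples later on this page; adjust them for your deployment:

```bash
# Query each telemetry stream exposed by Vivo Exporter
curl -s http://127.0.0.1:2025/logs
curl -s http://127.0.0.1:2025/metrics
curl -s http://127.0.0.1:2025/traces
```
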
The example below generates dummy log events, which will then be consumed by using the `curl` HTTP command-line client:

**Configure and start Fluent Bit**

```python
[INPUT]
    name dummy
    tag  events
    rate 2

[OUTPUT]
    name vivo_exporter
```

**Retrieve the data**

```bash
curl -i http://127.0.0.1:2025/logs
```

> The `-i` curl option is used to also print the HTTP response headers.

The curl output will look like this:

```bash
HTTP/1.1 200 OK
Server: Monkey/1.7.0
Date: Tue, 21 Mar 2023 16:42:28 GMT
Transfer-Encoding: chunked
Content-Type: application/json
Vivo-Stream-Start-ID: 0
Vivo-Stream-End-ID: 3

[[1679416945459254000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416945959398000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416946459271000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416946959943000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416947459806000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416947958777000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416948459391000,{"_tag":"events"}],{"message":"dummy"}]
```

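Each line of the body is a JSON array of the form `[[timestamp, metadata], record]`, as shown above. If `jq` is available, the record content can be extracted directly, for example:

```bash
# Print only the "message" field of each log event
curl -s http://127.0.0.1:2025/logs | jq -r '.[1].message'
```
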
### Streams and IDs

As mentioned above, each stream buffers chunks that contain N events, and each chunk has its own ID, which is unique within the stream.

In the HTTP response, Vivo Exporter also reports the range of chunk IDs that were served through the HTTP headers `Vivo-Stream-Start-ID` and `Vivo-Stream-End-ID`.

The client application can use the values of these headers to specify a range of IDs or to set limits on the number of chunks to retrieve from the stream.

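For example, the ID range served by a request can be inspected by filtering the response headers (a quick check, reusing the curl example above):

```bash
# Show only the Vivo stream ID range headers from a /logs request
curl -si http://127.0.0.1:2025/logs | grep -i '^Vivo-Stream-'
```
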
### Retrieve ranges and use limits

A client might want to always retrieve the latest chunks available and skip the ones it has already processed. On a first request without any given range, Vivo Exporter provides all the content that exists in the buffer for the specific stream; from that response, the client can keep the last ID that was received (`Vivo-Stream-End-ID`) and use it in follow-up requests, as sketched at the end of this page.

To query ranges or start from a specific chunk ID (remember that IDs are incremental), you can use a mix of the following options:

| Query string option | Description |
| ------------------- | ----------- |
| `from` | Specify the first chunk ID to retrieve. If that chunk ID does not exist, the next one in the queue is provided. |
| `to` | The last chunk ID desired. If not found, the whole stream is provided (starting from `from`, if it was set). |
| `limit` | Limit the output to a specific number of chunks. The default value is `0`, which means send everything. |

The following example requests the range from chunk ID 1 to chunk ID 3, limited to 1 chunk:

`curl -i "http://127.0.0.1:2025/logs?from=1&to=3&limit=1"`

Output:

```bash
HTTP/1.1 200 OK
Server: Monkey/1.7.0
Date: Tue, 21 Mar 2023 16:45:05 GMT
Transfer-Encoding: chunked
Content-Type: application/json
Vivo-Stream-Start-ID: 1
Vivo-Stream-End-ID: 1

[[1679416945959398000,{"_tag":"events"}],{"message":"dummy"}]
[[1679416946459271000,{"_tag":"events"}],{"message":"dummy"}]
```
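
Putting the headers and query options together, a client can consume a stream incrementally by remembering the last `Vivo-Stream-End-ID` it received and passing the next ID back through `from` on the following request. Below is a minimal polling sketch with `curl`, assuming the same address as the examples above; exact behavior when no new chunks are available may vary:

```bash
#!/bin/sh
# Incrementally consume the logs stream: remember the last chunk ID served
# (reported in the Vivo-Stream-End-ID header) and ask only for newer chunks.
NEXT=0
while true; do
  # -s: silent, -i: include response headers so Vivo-Stream-End-ID can be read
  RESPONSE=$(curl -si "http://127.0.0.1:2025/logs?from=${NEXT}" | tr -d '\r')
  # Body: everything after the first blank line (end of the HTTP headers)
  echo "$RESPONSE" | sed '1,/^$/d'
  # Last chunk ID served in this response
  END_ID=$(echo "$RESPONSE" | awk -F': ' '/^Vivo-Stream-End-ID:/ {print $2}')
  # Continue from the next chunk on the following poll
  if [ -n "$END_ID" ]; then
    NEXT=$((END_ID + 1))
  fi
  sleep 5
done
```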
