**Configure the `transform/logs` processor**: In `agent.yaml`, apply the processor to `log_statements` in the `resource` context and retain only the relevant resource attributes (`com.splunk.sourcetype`, `host.name`, `otelcol.service.mode`):
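The configuration referenced here is not shown in this excerpt. A minimal sketch, assuming the standard transform processor syntax with OTTL's `keep_keys` editor (the processor name `transform/logs` and the three attribute keys come from the text above; the statement itself is an assumption):

```yaml
  transform/logs:                     # Transform processor named "logs"
    log_statements:                   # Log processing statements
      - context: resource             # Resource Context
        statements:
          # Assumption: keep_keys drops every resource attribute not listed here
          - keep_keys(attributes, ["com.splunk.sourcetype", "host.name", "otelcol.service.mode"])
```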
This configuration ensures that only the specified attributes are retained, improving log efficiency and reducing unnecessary metadata.
**Adding a Context Block for Log Severity Mapping**: To properly set the `severity_text` and `severity_number` fields of a log record, add another `log` context block within `log_statements`. This configuration extracts the `level` value from the log body, maps it to `severity_text`, and assigns the appropriate `severity_number`:
```yaml
      - context: log                                              # Log Context
        statements:                                               # Transform Statements Array
          - set(cache, ParseJSON(body)) where IsMatch(body, "^\\{")
          - flatten(cache, "")
          - merge_maps(attributes, cache, "upsert")
          - set(severity_text, attributes["level"])
          - set(severity_number, 1) where severity_text == "TRACE"
          - set(severity_number, 5) where severity_text == "DEBUG"
          - set(severity_number, 9) where severity_text == "INFO"
          - set(severity_number, 13) where severity_text == "WARN"
          - set(severity_number, 17) where severity_text == "ERROR"
          - set(severity_number, 21) where severity_text == "FATAL"
```
**Summary of Key Transformations**:

- **Parse JSON**: Extracts structured data from the log body.
- **Flatten JSON**: Converts nested JSON objects into a flat structure.
- **Merge Attributes**: Integrates extracted data into log attributes.
- **Map Severity Text**: Assigns `severity_text` from the log's `level` attribute.
This method of mapping all JSON fields to top-level attributes should only be used for **testing and debugging OTTL**. It will result in high cardinality in a production scenario.
{{% /notice %}}
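For production, a safer pattern than flattening the whole body is to promote only the fields you actually need. A hedged sketch using the same OTTL functions as above (`level` is the field from this example; the `!= nil` guard is an assumption):

```yaml
      - context: log
        statements:
          - set(cache, ParseJSON(body)) where IsMatch(body, "^\\{")
          # Promote only specific fields instead of merging everything
          - set(attributes["level"], cache["level"]) where cache["level"] != nil
```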
**Update the `logs` pipeline**: Add the `transform/logs` processor to the `logs` pipeline:
```yaml
    logs:                               # Logs Pipeline
      receivers:                        # Array of receivers in this pipeline
        - filelog/quotes
        - otlp
      processors:                       # Array of Processors in this pipeline
        - memory_limiter                # You could also use [memory_limiter]
        - resourcedetection
        - resource/add_mode
        - transform/logs
        - batch
```
**Start the Log Generator**: In the **Test** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and start the appropriate `log-gen` script for your system. We want to work with structured JSON logs, so add the `-json` flag.
```sh
./log-gen.sh -json
```
The script will begin writing lines to a file named `./quotes.log`, while displaying a single line of output in the console.
```txt
Writing logs to quotes.log. Press Ctrl+C to stop.
```
**Start the Gateway**: In the **Gateway** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and run:
```sh
../otelbin --config=gateway.yaml
```
**Start the Agent**: In the **Agent** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and run:
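The Agent start command is not included in this excerpt; by analogy with the Gateway step, it is presumably the same binary pointed at the agent configuration file:

```sh
../otelbin --config=agent.yaml
```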
**Check the debug output**: For both the **Agent** and **Gateway**, confirm that `com.splunk/source` and `os.type` have been removed:
{{% tabs %}}
{{% tab title="New Debug Output" %}}
{{% /tab %}}
{{% /tabs %}}

**Check the debug output**: For both the **Agent** and **Gateway**, confirm that `SeverityText` and `SeverityNumber` in the `LogRecord` are now defined with the severity `level` from the log body. Confirm that the JSON fields from the body can be accessed as top-level log `Attributes`:
{{% tabs %}}
{{% tab title="New Debug Output" %}}
{{% /tab %}}
{{% /tabs %}}
**Check file output**: In the new `gateway-logs.out` file, verify that the data has been transformed: