
Commit f0c947b

Transform formatting
1 parent e6b826f commit f0c947b

File tree

3 files changed: +65 −63 lines


content/en/ninja-workshops/10-advanced-otel/7-transform-data/7-1-configuration.md

Lines changed: 46 additions & 44 deletions
@@ -6,61 +6,63 @@ weight: 1

{{% notice title="Exercise" style="green" icon="running" %}}

**Configure the `transform/logs` processor**: In `agent.yaml`, apply the processor to `log_statements` in the `resource` context and retain only the relevant resource attributes (`com.splunk.sourcetype`, `host.name`, `otelcol.service.mode`):

```yaml
transform/logs:                  # Processor Type/Name
  log_statements:                # Log Processing Statements
    - context: resource          # Log Context
      statements:                # List of attribute keys to keep
        - keep_keys(attributes, ["com.splunk.sourcetype", "host.name", "otelcol.service.mode"])
```

This configuration ensures that only the specified attributes are retained, improving log efficiency and reducing unnecessary metadata.
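To make the effect of `keep_keys` concrete, here is a minimal Python sketch (a hypothetical stand-in for the OTTL function, not Collector code); the example attribute values are invented:

```python
def keep_keys(attributes: dict, keys: list) -> dict:
    """Drop every attribute whose key is not on the allow-list."""
    return {k: v for k, v in attributes.items() if k in keys}

# Hypothetical resource attributes before the transform runs.
resource = {
    "com.splunk.sourcetype": "quotes",
    "host.name": "workshop-host",
    "otelcol.service.mode": "agent",
    "os.type": "linux",                    # dropped by keep_keys
    "com.splunk/source": "./quotes.log",   # dropped by keep_keys
}

kept = keep_keys(resource, ["com.splunk.sourcetype", "host.name", "otelcol.service.mode"])
print(sorted(kept))  # ['com.splunk.sourcetype', 'host.name', 'otelcol.service.mode']
```

Everything not named in the list is discarded, which is exactly why `os.type` and `com.splunk/source` disappear from the debug output later in this workshop.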
**Adding a Context Block for Log Severity Mapping**: To properly set the `severity_text` and `severity_number` fields of a log record, add another log context block within `log_statements`. This configuration extracts the `level` value from the log body, maps it to `severity_text`, and assigns the appropriate `severity_number`:

```yaml
    - context: log               # Log Context
      statements:                # Transform Statements Array
        - set(cache, ParseJSON(body)) where IsMatch(body, "^\\{")
        - flatten(cache, "")
        - merge_maps(attributes, cache, "upsert")
        - set(severity_text, attributes["level"])
        - set(severity_number, 1) where severity_text == "TRACE"
        - set(severity_number, 5) where severity_text == "DEBUG"
        - set(severity_number, 9) where severity_text == "INFO"
        - set(severity_number, 13) where severity_text == "WARN"
        - set(severity_number, 17) where severity_text == "ERROR"
        - set(severity_number, 21) where severity_text == "FATAL"
```
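As a rough illustration of what these log-context statements do, the following plain-Python sketch (hypothetical helpers with an invented log body, not Collector code) mirrors the parse, flatten, merge, and severity-mapping steps; the numeric values match the mapping above:

```python
import json

# Severity text -> number, as used in the transform statements above.
SEVERITY_NUMBER = {"TRACE": 1, "DEBUG": 5, "INFO": 9,
                   "WARN": 13, "ERROR": 17, "FATAL": 21}

def flatten(obj, prefix=""):
    """Mimic OTTL flatten(): collapse nested maps into dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

def transform(body, attributes):
    # set(cache, ParseJSON(body)) where IsMatch(body, "^\\{")
    cache = json.loads(body) if body.startswith("{") else {}
    # flatten(cache, "") followed by merge_maps(attributes, cache, "upsert")
    attributes.update(flatten(cache))
    # set(severity_text, ...) and the set(severity_number, ...) statements
    severity_text = attributes.get("level")
    severity_number = SEVERITY_NUMBER.get(severity_text, 0)  # 0 = unspecified
    return attributes, severity_text, severity_number

attrs, text, number = transform('{"level": "WARN", "quote": {"author": "X"}}', {})
print(text, number, attrs["quote.author"])  # WARN 13 X
```

Note how the nested `quote.author` field becomes a top-level dotted attribute, which is what makes the JSON fields visible as log `Attributes` in the debug output.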

**Summary of Key Transformations**:

- **Parse JSON**: Extracts structured data from the log body.
- **Flatten JSON**: Converts nested JSON objects into a flat structure.
- **Merge Attributes**: Integrates extracted data into log attributes.
- **Map Severity Text**: Assigns `severity_text` from the log's `level` attribute.
- **Assign Severity Numbers**: Converts severity levels into standardized numerical values.

This setup ensures that log severity is properly extracted, standardized, and structured for efficient processing.
{{% notice title="Tip" style="primary" icon="lightbulb" %}}
This method of mapping all JSON fields to top-level attributes should only be used for **testing and debugging OTTL**. It will result in high cardinality in a production scenario.
{{% /notice %}}

**Update the `logs` pipeline**: Add the `transform/logs` processor to the `logs:` pipeline:
```yaml
logs:                            # Logs Pipeline
  receivers:                     # Array of receivers in this pipeline
    - filelog/quotes
    - otlp
  processors:                    # Array of Processors in this pipeline
    - memory_limiter             # You could also use [memory_limiter]
    - resourcedetection
    - resource/add_mode
    - transform/logs
    - batch
```

{{% /notice %}}
content/en/ninja-workshops/10-advanced-otel/7-transform-data/7-2-setup.md

Lines changed: 16 additions & 16 deletions
@@ -6,28 +6,28 @@ weight: 2

{{% notice title="Exercise" style="green" icon="running" %}}

**Start the Log Generator**: In the **Test** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and start the appropriate `log-gen` script for your system. We want to work with structured JSON logs, so add the `-json` flag:

```sh
./log-gen.sh -json
```

The script will begin writing lines to a file named `./quotes.log`, while displaying a single line of output in the console:

```txt
Writing logs to quotes.log. Press Ctrl+C to stop.
```

**Start the Gateway**: In the **Gateway** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and run:

```sh
../otelbin --config=gateway.yaml
```

**Start the Agent**: In the **Agent** terminal window, navigate to the `[WORKSHOP]/7-transform-data` directory and run:

```sh
../otelbin --config=agent.yaml
```

{{% /notice %}}

content/en/ninja-workshops/10-advanced-otel/7-transform-data/7-3-test-transform.md

Lines changed: 3 additions & 3 deletions
@@ -14,7 +14,7 @@ This ensures proper metadata filtering, severity mapping, and structured log enr

{{% notice title="Exercise" style="green" icon="running" %}}

**Check the debug output**: For both the **Agent** and **Gateway**, confirm that `com.splunk/source` and `os.type` have been removed:

{{% tabs %}}
{{% tab title="New Debug Output" %}}
@@ -41,7 +41,7 @@ This ensures proper metadata filtering, severity mapping, and structured log enr
{{% /tab %}}
{{% /tabs %}}

**Check the debug output**: For both the **Agent** and **Gateway**, confirm that `SeverityText` and `SeverityNumber` in the `LogRecord` are now defined with the severity `level` from the log body. Confirm that the JSON fields from the body can be accessed as top-level log `Attributes`:

{{% tabs %}}
{{% tab title="New Debug Output" %}}
@@ -85,7 +85,7 @@ This ensures proper metadata filtering, severity mapping, and structured log enr
{{% /tab %}}
{{% /tabs %}}

**Check the file output**: In the new `gateway-logs.out` file, verify that the data has been transformed:

{{% tabs %}}
{{% tab title="New File Output" %}}
