Commit f985d85

updated the count section
1 parent: 5e1f74e

2 files changed: +67 -26 lines changed


content/en/ninja-workshops/3-opentelemetry-collector-workshops/2-advanced-collector/9-sum-count/9-1-count-test.md

Lines changed: 13 additions & 22 deletions
@@ -46,54 +46,45 @@ jq '.resourceMetrics[].scopeMetrics[].metrics[]
 {{% /tab %}}
 {{% tab title="jq example output" %}}

-```text
+```json
 {
   "name": "logs.sw.count",
-  "value": "1"
-}
-{
-  "name": "logs.error.count",
-  "value": "1"
+  "value": "2"
 }
 {
   "name": "logs.lotr.count",
-  "value": "5"
+  "value": "2"
 }
 {
   "name": "logs.full.count",
-  "value": "6"
+  "value": "4"
 }
 {
   "name": "logs.error.count",
-  "value": "1"
-}
-{
-  "name": "logs.lotr.count",
-  "value": "7"
+  "value": "2"
 }
 {
-  "name": "logs.full.count",
-  "value": "10"
+  "name": "logs.error.count",
+  "value": "1"
 }
 {
   "name": "logs.sw.count",
-  "value": "3"
+  "value": "2"
 }
 {
   "name": "logs.lotr.count",
-  "value": "3"
+  "value": "6"
 }
 {
   "name": "logs.full.count",
-  "value": "4"
-}
-{
-  "name": "logs.sw.count",
-  "value": "1"
+  "value": "8"
 }
 ```

 {{% /tab %}}
 {{% /tabs %}}
+{{% notice title="Tip" style="primary" icon="lightbulb" %}}
+Note: the `logs.full.count` should be equal to `logs.sw.count` + `logs.lotr.count`, while the `logs.error.count` will be a random value.
+{{% /notice %}}

 {{% /notice %}}
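The jq output above comes from the counters this commit wires up in the second file. Because that file's first hunk only shows the top of the `connectors` block, here is a minimal sketch of what a complete `count` connector definition could look like. The metric names are the workshop's; the `movie` attribute and the OTTL conditions are assumptions for illustration, not part of this commit.

```yaml
connectors:
  count:
    logs:
      logs.full.count:
        description: Count of all log records            # no conditions, so every record is counted
      logs.sw.count:
        description: Count of Star Wars quote logs
        conditions:
          - attributes["movie"] == "Star Wars"           # assumed attribute name and value
      logs.lotr.count:
        description: Count of Lord of the Rings quote logs
        conditions:
          - attributes["movie"] == "Lord of the Rings"   # assumed attribute name and value
      logs.error.count:
        description: Count of log records with ERROR severity
        conditions:
          - severity_text == "ERROR"
```

Defined this way, `logs.full.count` increments for every record while the other counters only increment when their condition matches, which is consistent with the tip above: in the sample output, 4 = 2 + 2 and 8 = 2 + 6.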

content/en/ninja-workshops/3-opentelemetry-collector-workshops/2-advanced-collector/9-sum-count/_index.md

Lines changed: 54 additions & 4 deletions
@@ -37,11 +37,12 @@ The reason for the delay is that the Count Connector in the OpenTelemetry Collec

 {{% notice title="Exercise" style="green" icon="running" %}}

-- **Add and configure the Count Connector**
+- **Add the Count Connector**

-Include the Count Connector in the connectors section of your configuration and define the metrics counters:
+Include the Count Connector in the `connectors` section of your configuration and define the metrics counters:

 ```yaml
+connectors:
   count:
     logs:
       logs.full.count:
@@ -67,11 +68,60 @@ Include the Count Connector in the connectors section of your configuration and
 - `logs.lotr.count`: Counts logs that contain a quote from a Lord of the Rings movie.
 - `logs.error.count`: Represents a real-world scenario by counting logs with a severity level of ERROR.

+- **Configure the Count Connector in the pipelines**
+
+```yaml
+pipelines:
+  traces:
+    receivers:
+      - otlp
+    processors:
+      - memory_limiter
+      - attributes/update # Update, hash, and remove attributes
+      - redaction/redact # Redact sensitive fields using regex
+      - resourcedetection
+      - resource/add_mode
+      - batch
+    exporters:
+      - debug
+      - file
+      - otlphttp
+  metrics:
+    receivers:
+      - count
+      - otlp
+      #- hostmetrics # Host Metrics Receiver
+    processors:
+      - memory_limiter
+      - resourcedetection
+      - resource/add_mode
+      - batch
+    exporters:
+      - debug
+      - otlphttp
+  logs:
+    receivers:
+      - otlp
+      - filelog/quotes
+    processors:
+      - memory_limiter
+      - resourcedetection
+      - resource/add_mode
+      - transform/logs # Transform logs processor
+      - batch
+    exporters:
+      - count
+      - debug
+      - otlphttp
+```
+
 {{% /notice %}}

-We count logs based on their attributes. If your log data is stored in the log body instead of attributes, you’ll need to use a Transform processor in your pipeline to extract key/value pairs and add them as attributes.
+We count logs based on their attributes. If your log data is stored in the log body instead of attributes, you’ll need to use a `Transform` processor in your pipeline to extract key/value pairs and add them as attributes.
+
+In this workshop, we’ve already added `merge_maps(attributes, cache, "upsert")` in the `07-transform` section. This ensures that all relevant data is included in the log attributes for processing.

-In this workshop, we’ve already included `merge_maps(attributes, cache, "upsert")` in the Transform section. This ensures that all relevant data is available in the log attributes for processing.
+When selecting fields to create attributes from, be mindful: adding all fields indiscriminately is generally not ideal for production environments. Instead, choose only the fields that are truly necessary to avoid unnecessary data clutter.

 {{% notice title="Exercise" style="green" icon="running" %}}
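For readers who skipped the earlier `07-transform` step, here is a rough sketch of the pattern the paragraph above describes: parsing the log body and promoting the resulting key/value pairs into attributes with `merge_maps` so the Count Connector conditions can match on them. The `transform/logs` name matches the logs pipeline above, but the JSON-body assumption and the parsing statement are illustrative, not taken from this commit.

```yaml
processors:
  transform/logs:
    log_statements:
      - context: log
        statements:
          # Parse a JSON log body into the statement-scoped cache map (assumes JSON bodies)
          - merge_maps(cache, ParseJSON(body), "upsert") where IsMatch(body, "^\\{")
          # Promote the parsed key/value pairs to log attributes so the
          # Count Connector can evaluate its conditions against them
          - merge_maps(attributes, cache, "upsert")
```

Per the note above, promote only the fields the counters actually need rather than every key in the body.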
