
Commit cdadb39

1 parent 0d1bcd1 commit cdadb39

File tree: 6 files changed (+2527, -189 lines)

jmeter/CHANGELOG.md

Lines changed: 25 additions & 4 deletions
@@ -1,8 +1,29 @@
-# CHANGELOG - JMeter
+# Changes
 
+## 0.6.0
+* [Added] Add cumulative metrics support to mirror JMeter's Aggregate Report.
+* [Added] Add `statisticsCalculationMode` configuration option to control percentile calculation algorithms (`ddsketch`, `aggregate_report`, `dashboard`).
+* [Added] Add assertion metrics to track success and failure of assertions.
+* [Added] Add Datadog Events for test start and test end.
 
-## 1.0.0
+## 0.5.0
+* [Added] Add ability to exclude sample results to be sent as logs based on response code regex
+  See [#47](https://github.com/DataDog/jmeter-datadog-backend-listener/issues/47)
 
-***Added***:
+## 0.4.0
+* [Changed] Set configured tags on plugin generated logs. (See [#45](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/45)).
 
-* Initial release.
+## 0.3.1
+* [Fixed] Setting `includeSubresults` to `true` will now also include the parent results as well as subresults recursively (See [#35](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/35)).
+
+## 0.3.0
+* [Added] Add ability to release to Maven Central. See [#26](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/26)
+* [Added] Add custom tags to global metrics. See [#23](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/23)
+
+## 0.2.0
+* [Added] Add `customTags` config option. See [#15](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/15)
+* [Added] Tag metrics by `thread_group`. See [#17](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/17)
+* [Added] Add `thread_group` to log payload. See [#18](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/18)
+
+## 0.1.0
+Initial release

jmeter/README.md

Lines changed: 80 additions & 16 deletions
@@ -1,27 +1,37 @@
-# Agent Check: JMeter
+# Datadog Backend Listener for Apache JMeter
+
+![screenshot](images/screenshot.png)
 
 ## Overview
 
-Datadog Backend Listener for Apache JMeter is an open source JMeter plugin used to send test results to the Datadog platform. It provides real-time reporting of test metrics like latency, the number of bytes sent and received, and more. You can also send to Datadog complete test results as log entries.
+Datadog Backend Listener for Apache JMeter is a JMeter plugin used to send test results to the Datadog platform. It includes the following features:
+
+- Real time reporting of test metrics (latency, bytes sent and more). See the Metrics section.
+- Real time reporting of test results as Datadog log events.
+- Ability to include sub results.
 
 ## Setup
 
 ### Installation
 
 The Datadog Backend Listener plugin needs to be installed manually. See the latest release and more up-to-date installation instructions on its [GitHub repository][1].
 
+You can install the plugin either manually or with JMeter Plugins Manager.
+
+No Datadog Agent is necessary.
+
 #### Manual installation
 
-1. Download the Datadog plugin JAR file from the [release page][5]
+1. Download the Datadog plugin JAR file from the [release page][5].
 2. Place the JAR in the `lib/ext` directory within your JMeter installation.
 3. Launch JMeter (or quit and re-open the application).
 
 #### JMeter plugins Manager
 
 1. If not already configured, download the [JMeter Plugins Manager JAR][6].
-2. Once you've completed the download, place the `.jar` in the `lib/ext` directory within your JMeter installation.
-3. Launch JMeter (or quit and re-open the application).
-4. Go to `Options > Plugins Manager > Available Plugins`.
+2. Once you've completed the download, place the `.jar` in the `lib/ext` directory within your JMeter installation.
+3. Launch JMeter (or quit and re-open the application).
+4. Go to `Options > Plugins Manager > Available Plugins`.
 5. Search for "Datadog Backend Listener".
 6. Click the checkbox next to the Datadog Backend Listener plugin.
 7. Click "Apply Changes and Restart JMeter".
@@ -30,9 +40,9 @@ The Datadog Backend Listener plugin needs to be installed manually. See the late
 
 To start reporting metrics to Datadog:
 
-1. Right click on the thread group or the test plan for which you want to send metrics to Datadog.
+1. Right click on the thread group or the test plan for which you want to send metrics to Datadog.
 2. Go to `Add > Listener > Backend Listener`.
-3. Modify the `Backend Listener Implementation` and select `org.datadog.jmeter.plugins.DatadogBackendClient` from the drop-down.
+3. Modify the `Backend Listener Implementation` and select `org.datadog.jmeter.plugins.DatadogBackendClient` from the drop-down.
 4. Set the `apiKey` variable to [your Datadog API key][7].
 5. Run your test and validate that metrics have appeared in Datadog.
 
@@ -41,39 +51,93 @@ The plugin has the following configuration options:
 | Name | Required | Default value | description|
 |------------|:--------:|---------------|------------|
 |apiKey | true | NA | Your Datadog API key.|
-|datadogUrl | false | https://api.datadoghq.com/api/ | You can configure a different endpoint, for instance https://api.datadoghq.eu/api/ if your datadog instance is in the EU|
-|logIntakeUrl | false | https://http-intake.logs.datadoghq.com/v1/input/ | You can configure a different endpoint, for instance https://http-intake.logs.datadoghq.eu/v1/input/ if your datadog instance is in the EU.|
-|metricsMaxBatchSize|false|200|Metrics are submitted every 10 seconds in batches of size `metricsMaxBatchSize`.|
+|datadogUrl | false | <https://api.datadoghq.com/api/> | You can configure a different endpoint, for instance <https://api.datadoghq.eu/api/> if your datadog instance is in the EU|
+|logIntakeUrl | false | <https://http-intake.logs.datadoghq.com/v1/input/> | You can configure a different endpoint, for instance <https://http-intake.logs.datadoghq.eu/v1/input/> if your datadog instance is in the EU|
+|metricsMaxBatchSize|false|200|Metrics are submitted every 10 seconds in batches of size `metricsMaxBatchSize`|
 |logsBatchSize|false|500|Logs are submitted in batches of size `logsBatchSize` as soon as this size is reached.|
-|sendResultsAsLogs|false|false|By default only metrics are reported to Datadog. To report individual test results as log events, set this field to `true`.|
+|sendResultsAsLogs|false|true|By default, individual test results are reported as log events. Set to `false` to disable log reporting.|
 |includeSubresults|false|false|A subresult is for instance when an individual HTTP request has to follow redirects. By default subresults are ignored.|
 |excludeLogsResponseCodeRegex|false|`""`| Setting `sendResultsAsLogs` will submit all results as logs to Datadog by default. This option lets you exclude results whose response code matches a given regex. For example, you may set this option to `[123][0-5][0-9]` to only submit errors.|
-|samplersRegex|false|.*|An optional regex to filter the samplers to monitor.|
-|customTags|false|`""`|Comma-separated list of tags to add to every metric
+|samplersRegex|false|`""`|Regex to filter which samplers to include. By default all samplers are included.|
+|customTags|false|`""`|Comma-separated list of tags to add to every metric.|
+|statisticsCalculationMode|false|`ddsketch`|Algorithm for percentile calculation: `ddsketch` (default), `aggregate_report` (matches JMeter Aggregate Reports), or `dashboard` (matches JMeter HTML Dashboards).|
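The two regex options in the table above are easier to reason about with a concrete check. The following minimal Java sketch is illustrative only (it is not part of this commit or of the plugin's code; the sampler labels and the `checkout.*` pattern are invented): it shows how a `samplersRegex` filter and the `[123][0-5][0-9]` exclusion example from the table behave.

```java
import java.util.List;
import java.util.regex.Pattern;

// Minimal sketch, not the plugin's code: how the two regex options behave.
// The sampler labels and the "checkout.*" filter below are made-up examples.
public class RegexOptionsSketch {
    public static void main(String[] args) {
        // samplersRegex: only samplers whose name matches the regex are reported.
        Pattern samplers = Pattern.compile("checkout.*");
        for (String label : List.of("checkout-cart", "login", "checkout-pay")) {
            System.out.println(label + " reported: " + samplers.matcher(label).matches());
        }

        // excludeLogsResponseCodeRegex: results whose response code matches the
        // regex are NOT submitted as logs. The table's example "[123][0-5][0-9]"
        // matches 1xx-3xx codes, so only 4xx/5xx results are sent as logs.
        Pattern exclude = Pattern.compile("[123][0-5][0-9]");
        for (String code : List.of("200", "302", "404", "503")) {
            boolean sentAsLog = !exclude.matcher(code).matches();
            System.out.println("response code " + code + " sent as log: " + sentAsLog);
        }
    }
}
```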
+
+## Statistics Calculation Modes
+
+- **ddsketch** (default): Uses Datadog's DDSketch algorithm. It computes approximate percentiles with a 1% error guarantee compared to the exact percentile value, while using very little memory. When compared with `aggregate_report`, results may differ more because `aggregate_report` picks the nearest observed value for each percentile, which can cause noticeable jumps when there are few samples.
+- **aggregate_report**: Matches JMeter's "Aggregate Reports" listener. It stores all response times in memory and calculates percentiles using the "nearest rank" method (nearest exact value from the dataset).
+- **dashboard**: Uses a sliding window and linear interpolation (by default) to calculate percentiles, matching JMeter's HTML Dashboards. This mode may diverge significantly from the others when the limit of the sliding window is reached (default 20,000, but [configurable](https://jmeter.apache.org/usermanual/properties_reference.html#reporting)).
+
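To make the gap between these modes concrete, here is a minimal Java sketch. It is illustrative only, not the plugin's implementation and not JMeter's exact internals: it compares a nearest-rank percentile (the `aggregate_report` approach) with a linearly interpolated one (the `dashboard` approach) on a small sample set.

```java
import java.util.Arrays;

// Illustrative sketch only: contrasts the "nearest rank" percentile used by
// JMeter's Aggregate Report with a linearly interpolated percentile like the
// HTML dashboard's. DDSketch (the plugin's default) is a separate approximate
// algorithm and is not reproduced here.
public class PercentileModesSketch {

    // Nearest rank: smallest value such that at least p% of samples are <= it.
    static double nearestRank(double[] sorted, double p) {
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(rank - 1, 0)];
    }

    // Linear interpolation between the two closest ranks.
    static double interpolated(double[] sorted, double p) {
        double pos = p / 100.0 * (sorted.length - 1);
        int lower = (int) Math.floor(pos);
        int upper = (int) Math.ceil(pos);
        double frac = pos - lower;
        return sorted[lower] + frac * (sorted[upper] - sorted[lower]);
    }

    public static void main(String[] args) {
        // A small set of response times (ms): with few samples the two
        // methods can diverge noticeably.
        double[] responseTimes = {120, 130, 150, 400, 900};
        Arrays.sort(responseTimes);
        System.out.println("p95 nearest rank: " + nearestRank(responseTimes, 95));  // 900.0
        System.out.println("p95 interpolated: " + interpolated(responseTimes, 95)); // 800.0
    }
}
```

With only five samples the two answers differ by 100 ms, which is the kind of jump the `ddsketch` description above refers to.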
+## Assertion Failures vs Errors
+
+JMeter distinguishes between assertion failures and assertion errors. A failure means the assertion evaluated and did not pass. An error means the assertion could not be evaluated (for example, a null response or a script error). These map to `jmeter.assertions.failed` and `jmeter.assertions.error`.
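As a rough illustration of that distinction, the sketch below uses JMeter's own `AssertionResult` API to classify a sample's assertion results the way the two metrics suggest. It is not the plugin's actual code, and it assumes the JMeter core library is on the classpath.

```java
import org.apache.jmeter.assertions.AssertionResult;
import org.apache.jmeter.samplers.SampleResult;

// Minimal sketch, not the plugin's implementation: split assertion results into
// "failed" (assertion evaluated and did not pass) and "error" (assertion could
// not be evaluated), mirroring jmeter.assertions.failed / jmeter.assertions.error.
public class AssertionClassificationSketch {
    static long failed = 0;
    static long errors = 0;

    static void countAssertions(SampleResult sample) {
        for (AssertionResult assertion : sample.getAssertionResults()) {
            if (assertion.isError()) {
                errors++;   // could not be evaluated (null response, script error, ...)
            } else if (assertion.isFailure()) {
                failed++;   // evaluated, but the condition was not met
            }
        }
    }

    public static void main(String[] args) {
        SampleResult sample = new SampleResult();
        sample.addAssertionResult(new AssertionResult("status is 200"));
        AssertionResult body = new AssertionResult("body contains token");
        body.setFailure(true);
        sample.addAssertionResult(body);
        countAssertions(sample);
        System.out.println("failed=" + failed + ", errors=" + errors); // failed=1, errors=0
    }
}
```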
+
+## Getting Final Results in Datadog Notebooks
+
+To match JMeter's Aggregate Reports in a Datadog notebook, set `statisticsCalculationMode=aggregate_report` and query the `jmeter.final_result.*` metrics. These are emitted once at test end, so they are ideal for a single, authoritative snapshot.
+
+**Note**: Since these metrics are emitted only once at the end of the test, ensure your selected time interval includes the test completion time.
+
+Example queries (adjust tags as needed):
+
+```text
+avg:jmeter.final_result.response_time.p95{sample_label:total,test_run_id:YOUR_RUN_ID}
+avg:jmeter.final_result.responses.error_percent{sample_label:total,test_run_id:YOUR_RUN_ID}
+avg:jmeter.final_result.throughput.rps{sample_label:total,test_run_id:YOUR_RUN_ID}
+```
+
+**Distributed Mode**: In distributed tests, each runner calculates percentiles independently. Add the `runner_id` tag to filter by a specific runner (e.g., `runner_id:runner-1`).
+
+## Test Run Tagging
+
+The plugin automatically adds a `test_run_id` tag to all metrics, logs, and events (Test Started/Ended) to help you isolate and filter specific test executions in Datadog.
+
+- **Format**: `{ISO-8601 timestamp}-{hostname}-{random8chars}`
+- Example: `2026-01-24T14:30:25Z-myhost-a1b2c3d4`
+- In distributed mode, the `hostname` becomes the `runner_id` (the JMeter distributed prefix) when present.
+
+You can override this by providing your own `test_run_id` in the `customTags` configuration (e.g., `test_run_id:my-custom-run-id`). Any additional tags you add to `customTags` will also be included alongside the `test_run_id`.
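For illustration only, the short Java sketch below produces an identifier in the documented `{ISO-8601 timestamp}-{hostname}-{random8chars}` shape; it is not the plugin's code, just a way to see what the format looks like, for example when choosing your own override value.

```java
import java.net.InetAddress;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.UUID;

// Minimal sketch, not the plugin's implementation: build an identifier in the
// documented {ISO-8601 timestamp}-{hostname}-{random8chars} format.
public class TestRunIdSketch {
    public static void main(String[] args) throws Exception {
        String timestamp = ZonedDateTime.now(ZoneOffset.UTC)
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss'Z'"));
        String hostname = InetAddress.getLocalHost().getHostName();
        String random8 = UUID.randomUUID().toString().replace("-", "").substring(0, 8);
        System.out.println(timestamp + "-" + hostname + "-" + random8);
        // e.g. 2026-01-24T14:30:25Z-myhost-a1b2c3d4
    }
}
```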
 
 ## Data Collected
 
 ### Metrics
 
 See [metadata.csv][2] for a list of metrics provided by this check.
 
+The plugin emits three types of metrics:
+
+- **Interval metrics** (`jmeter.*`): Real-time metrics reset each reporting interval, useful for monitoring during test execution.
+- **Cumulative metrics** (`jmeter.cumulative.*`): Aggregate statistics over the entire test duration, similar to JMeter's Aggregate Reports. These include a `final_result` tag (`true` at test end, `false` during execution).
+- **Final result metrics** (`jmeter.final_result.*`): Emitted only once at test completion, providing an unambiguous way to query final test results without filtering by tag.
+
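The difference between the interval and cumulative families is essentially whether counters reset at each reporting interval. A minimal Java sketch, illustrative only (the counter names and flush logic are invented, not the plugin's code):

```java
// Minimal sketch, not the plugin's code: an interval counter is reported and
// then reset every reporting interval, while a cumulative counter keeps
// aggregating over the whole test, mirroring the jmeter.* vs jmeter.cumulative.*
// split described above.
public class MetricKindsSketch {
    static long intervalCount = 0;    // reported, then reset each interval
    static long cumulativeCount = 0;  // reported, never reset during the test

    static void recordSample() {
        intervalCount++;
        cumulativeCount++;
    }

    static void flushInterval() {
        System.out.println("interval count   = " + intervalCount);
        System.out.println("cumulative count = " + cumulativeCount);
        intervalCount = 0; // interval metrics start over each reporting period
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) recordSample();
        flushInterval(); // interval=5, cumulative=5
        for (int i = 0; i < 3; i++) recordSample();
        flushInterval(); // interval=3, cumulative=8
    }
}
```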
 ### Service Checks
 
 JMeter does not include any service checks.
 
 ### Events
 
-JMeter does not include any events.
+The plugin sends Datadog Events at the start and end of each test run:
+
+- **JMeter Test Started**: Sent when the test begins
+- **JMeter Test Ended**: Sent when the test completes
+
+These events appear in the Datadog Event Explorer and can be used to correlate metrics with test execution windows.
 
 ## Troubleshooting
 
+If for whatever reason you are not seeing JMeter metrics in Datadog, check your `jmeter.log` file, which should be in the `/bin` folder of your JMeter installation.
+
+### Not Seeing `runner_id`?
+
+This is normal in local mode. The `runner_id` tag is only emitted in **distributed** tests, where JMeter provides a distributed prefix. In local runs, use `runner_host` or `runner_mode:local` for filtering instead.
+
 Need help? Contact [Datadog support][3].
 
 ## Further Reading
 
 Additional helpful documentation, links, and articles:
 
-- [Monitor JMeter test results with Datadog][4]
+- [Monitor JMeter test results with Datadog][4]
 
 [1]: https://github.com/DataDog/jmeter-datadog-backend-listener
 [2]: https://github.com/DataDog/integrations-core/blob/master/jmeter/metadata.csv
