Commit 93b447a

1 parent 0d1bcd1 commit 93b447a

File tree

8 files changed (+2538, -189 lines)

jmeter/CHANGELOG.md

Lines changed: 32 additions & 4 deletions

# Changes

## 0.6.0

* [Added] Add cumulative metrics support to mirror JMeter's Aggregate Report.
* [Added] Add `statisticsCalculationMode` configuration option to control the percentile calculation algorithm (`ddsketch`, `aggregate_report`, `dashboard`).
* [Added] Add assertion metrics to track the success and failure of assertions.
* [Added] Add Datadog Events for test start and test end.

## 0.5.0

* [Added] Add the ability to exclude sample results from being sent as logs based on a response code regex. See [#47](https://github.com/DataDog/jmeter-datadog-backend-listener/issues/47).

## 0.4.0

* [Changed] Set configured tags on plugin-generated logs. See [#45](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/45).

## 0.3.1

* [Fixed] Setting `includeSubresults` to `true` now includes the parent results as well as subresults, recursively. See [#35](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/35).

## 0.3.0

* [Added] Add the ability to release to Maven Central. See [#26](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/26).
* [Added] Add custom tags to global metrics. See [#23](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/23).

## 0.2.0

* [Added] Add the `customTags` config option. See [#15](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/15).
* [Added] Tag metrics by `thread_group`. See [#17](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/17).
* [Added] Add `thread_group` to the log payload. See [#18](https://github.com/DataDog/jmeter-datadog-backend-listener/pull/18).

## 0.1.0

* Initial release.

jmeter/README.md

Lines changed: 78 additions & 15 deletions

# Datadog Backend Listener for Apache JMeter

![screenshot](images/screenshot.png)

## Overview

Datadog Backend Listener for Apache JMeter is a JMeter plugin used to send test results to the Datadog platform. It includes the following features:

- Real-time reporting of test metrics (latency, bytes sent, and more); see the Metrics section below.
- Real-time reporting of test results as Datadog log events.
- The ability to include subresults.

## Setup

### Installation

The Datadog Backend Listener plugin is not bundled with JMeter and must be installed separately. See the latest release and up-to-date installation instructions in its [GitHub repository][1].

You can install the plugin either manually or with the JMeter Plugins Manager.

No Datadog Agent is necessary.

#### Manual installation

1. Download the Datadog plugin JAR file from the [release page][5].
2. Place the JAR in the `lib/ext` directory within your JMeter installation.
3. Launch JMeter (or quit and re-open the application).

#### JMeter Plugins Manager

1. If not already configured, download the [JMeter Plugins Manager JAR][6].
2. Once you've completed the download, place the `.jar` in the `lib/ext` directory within your JMeter installation.
3. Launch JMeter (or quit and re-open the application).
4. Go to `Options > Plugins Manager > Available Plugins`.
5. Search for "Datadog Backend Listener".
6. Click the checkbox next to the Datadog Backend Listener plugin.
7. Click "Apply Changes and Restart JMeter".

To start reporting metrics to Datadog:

1. Right click on the thread group or the test plan for which you want to send metrics to Datadog.
2. Go to `Add > Listener > Backend Listener`.
3. Modify the `Backend Listener Implementation` and select `org.datadog.jmeter.plugins.DatadogBackendClient` from the drop-down.
4. Set the `apiKey` variable to [your Datadog API key][7].
5. Run your test and validate that metrics have appeared in Datadog.

The plugin has the following configuration options:

| Name | Required | Default | Description |
|------------|:--------:|---------------|------------|
| apiKey | true | NA | Your Datadog API key. |
| datadogUrl | false | https://api.datadoghq.com/api/ | You can configure a different endpoint, for instance https://api.datadoghq.eu/api/ if your Datadog instance is in the EU. |
| logIntakeUrl | false | https://http-intake.logs.datadoghq.com/v1/input/ | You can configure a different endpoint, for instance https://http-intake.logs.datadoghq.eu/v1/input/ if your Datadog instance is in the EU. |
| metricsMaxBatchSize | false | 200 | Metrics are submitted every 10 seconds in batches of size `metricsMaxBatchSize`. |
| logsBatchSize | false | 500 | Logs are submitted in batches of size `logsBatchSize` as soon as this size is reached. |
| sendResultsAsLogs | false | true | By default, individual test results are reported as log events. Set to `false` to disable log reporting. |
| includeSubresults | false | false | A subresult occurs, for instance, when an individual HTTP request has to follow redirects. By default, subresults are ignored. |
| excludeLogsResponseCodeRegex | false | `""` | Setting `sendResultsAsLogs` submits all results as logs to Datadog by default. This option lets you exclude results whose response code matches a given regex. For example, set this option to `[123][0-5][0-9]` to submit only errors. |
| samplersRegex | false | `""` | Regex to filter which samplers to include. By default, all samplers are included. |
| customTags | false | `""` | Comma-separated list of tags to add to every metric. |
| statisticsCalculationMode | false | `ddsketch` | Algorithm for percentile calculation: `ddsketch` (default), `aggregate_report` (matches JMeter Aggregate Reports), or `dashboard` (matches JMeter HTML Dashboards). |
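
If you assemble test plans programmatically with the JMeter Java API rather than through the GUI, the same options can be passed as backend listener parameters. The following is a minimal sketch, assuming the JMeter core and components jars plus the Datadog plugin jar are on the classpath; the tag values are placeholders, and only the parameter names from the table above are used:

```java
import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.visualizers.backend.BackendListener;

// Sketch: configure the Datadog backend listener programmatically.
// Parameter names mirror the configuration table above; values are examples only.
public class DatadogListenerSketch {
    public static void main(String[] args) {
        Arguments ddArgs = new Arguments();
        ddArgs.addArgument("apiKey", "<YOUR_DATADOG_API_KEY>");
        ddArgs.addArgument("sendResultsAsLogs", "true");
        ddArgs.addArgument("statisticsCalculationMode", "aggregate_report");
        ddArgs.addArgument("customTags", "env:loadtest,team:perf"); // placeholder tags

        BackendListener listener = new BackendListener();
        listener.setName("Datadog Backend Listener");
        listener.setClassname("org.datadog.jmeter.plugins.DatadogBackendClient");
        listener.setArguments(ddArgs);
        // Attach `listener` under your thread group or test plan HashTree before running.
    }
}
```

Options that are not set fall back to the defaults listed in the table.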

#### Statistics Calculation Modes

- **ddsketch** (default): Uses Datadog's [DDSketch algorithm][8]. It provides approximate percentiles with a 1% error guarantee (relative to the theoretical value) and has a low memory footprint. When comparing with `aggregate_report`, the difference may be larger because `aggregate_report` uses the "nearest rank" method, which introduces its own divergence due to quantization, especially with sparse values (see the sketch after this list).
- **aggregate_report**: Matches JMeter's "Aggregate Reports" listener. It stores all response times in memory and calculates percentiles using the "nearest rank" method (the nearest exact value from the dataset).
- **dashboard**: Uses a sliding window and interpolation (by default) to calculate percentiles, matching [JMeter's HTML Dashboards][9]. This mode may diverge significantly from the others when the sliding window limit is reached (default 20,000, but [configurable][10]).
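
To make the divergence concrete, here is a small, self-contained sketch (not the plugin's code) contrasting the nearest-rank method used by `aggregate_report` with a linear-interpolation percentile of the general kind used by dashboard-style calculations; JMeter's exact interpolation formula may differ slightly:

```java
import java.util.Arrays;

// Sketch: why percentile modes diverge. "Nearest rank" returns an exact sample
// from the dataset, while an interpolated percentile blends the two surrounding samples.
public class PercentileModes {

    // Nearest-rank percentile, as used by JMeter's Aggregate Report.
    static double nearestRank(double[] sorted, double p) {
        int rank = (int) Math.ceil(p / 100.0 * sorted.length); // 1-based rank
        return sorted[Math.max(rank, 1) - 1];
    }

    // Linear interpolation between the two closest ranks (one common variant,
    // not necessarily JMeter's exact estimator).
    static double interpolated(double[] sorted, double p) {
        double pos = p / 100.0 * (sorted.length - 1);
        int lower = (int) Math.floor(pos);
        int upper = (int) Math.ceil(pos);
        return sorted[lower] + (pos - lower) * (sorted[upper] - sorted[lower]);
    }

    public static void main(String[] args) {
        // Sparse response times in milliseconds: quantization makes the methods diverge.
        double[] rt = {12, 15, 18, 25, 40, 45, 300, 1200};
        Arrays.sort(rt);
        System.out.println("p95 nearest rank : " + nearestRank(rt, 95));  // 1200.0
        System.out.println("p95 interpolated : " + interpolated(rt, 95)); // 885.0
    }
}
```

On this small, sparse dataset the two methods report very different p95 values (1200 ms versus 885 ms), which is the kind of gap you can expect between modes when samples are few or widely spread.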

#### Test Run Tagging

The plugin automatically adds a `test_run_id` tag to all metrics, logs, and events (Test Started/Ended) to help you isolate and filter specific test executions in Datadog.

- **Format**: `{hostname}-{ISO-8601 timestamp}-{random8chars}`
- Example: `myhost-2026-01-24T14:30:25Z-a1b2c3d4`
- In distributed mode, the `hostname` prefix becomes the `runner_id` (the JMeter distributed prefix) when present.

You can override this by providing your own `test_run_id` in the `customTags` configuration (e.g., `test_run_id:my-custom-run-id`). Any additional tags you add to `customTags` will also be included alongside the `test_run_id`.
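
For illustration only, here is a sketch that produces an identifier in the same `{hostname}-{ISO-8601 timestamp}-{random8chars}` shape; it is not the plugin's actual implementation:

```java
import java.net.InetAddress;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.UUID;

// Hypothetical example: build an ID shaped like the default test_run_id.
public class TestRunIdShape {
    public static void main(String[] args) throws Exception {
        String hostname = InetAddress.getLocalHost().getHostName();
        String timestamp = ZonedDateTime.now(ZoneOffset.UTC)
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss'Z'"));
        String random8 = UUID.randomUUID().toString().replace("-", "").substring(0, 8);
        System.out.println(hostname + "-" + timestamp + "-" + random8);
        // e.g. myhost-2026-01-24T14:30:25Z-a1b2c3d4
    }
}
```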

#### Assertion Failures vs Errors

JMeter distinguishes between assertion failures and assertion errors. A failure means the assertion was evaluated and did not pass. An error means the assertion could not be evaluated (for example, a null response or a script error). These map to the `jmeter.assertions.failed` and `jmeter.assertions.error` metrics, respectively.
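
For example, to track assertion outcomes during a run you could chart queries along these lines; the `test_run_id` and `sample_label` tags follow the query examples shown below and are assumptions to adjust for your setup:

```text
sum:jmeter.assertions.failed{test_run_id:YOUR_RUN_ID} by {sample_label}
sum:jmeter.assertions.error{test_run_id:YOUR_RUN_ID} by {sample_label}
```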

#### Getting Final Results in Datadog Notebooks

To match JMeter's Aggregate Reports in a Datadog notebook, set `statisticsCalculationMode=aggregate_report` and query the `jmeter.final_result.*` metrics. These are emitted once at test end, so they are ideal for a single, authoritative snapshot.

**Note**: Since these metrics are emitted only once at the end of the test, ensure your selected time interval includes the test completion time.

Example queries (adjust tags as needed):

```text
avg:jmeter.final_result.response_time.p95{sample_label:total,test_run_id:YOUR_RUN_ID}
avg:jmeter.final_result.responses.error_percent{sample_label:total,test_run_id:YOUR_RUN_ID}
avg:jmeter.final_result.throughput.rps{sample_label:total,test_run_id:YOUR_RUN_ID}
```

## Data Collected

### Metrics

See [metadata.csv][2] for a list of metrics provided by this check.

The plugin emits three types of metrics:

- **Interval metrics** (`jmeter.*`): Real-time metrics that reset each reporting interval, useful for monitoring during test execution.
- **Cumulative metrics** (`jmeter.cumulative.*`): Aggregate statistics over the entire test duration, similar to JMeter's Aggregate Reports. These include a `final_result` tag (`true` at test end, `false` during execution).
- **Final result metrics** (`jmeter.final_result.*`): Emitted only once at test completion, providing an unambiguous way to query final test results without filtering by tag.
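
For example, to chart cumulative statistics restricted to the end-of-test snapshot, filter on the `final_result` tag. The metric and tag names below are illustrative; check [metadata.csv][2] for the exact cumulative series:

```text
avg:jmeter.cumulative.response_time.p95{sample_label:total,final_result:true}
```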

### Service Checks

JMeter does not include any service checks.

### Events

The plugin sends Datadog Events at the start and end of each test run:

- **JMeter Test Started**: Sent when the test begins.
- **JMeter Test Ended**: Sent when the test completes.

These events appear in the Datadog Event Explorer and can be used to correlate metrics with test execution windows.

## Troubleshooting

If you are not seeing JMeter metrics in Datadog, check your `jmeter.log` file, located in the `bin` folder of your JMeter installation.

#### Not Seeing `runner_id`?

This is normal in local mode. The `runner_id` tag is only emitted in **distributed** tests, where JMeter provides a distributed prefix. In local runs, use `runner_host` or `runner_mode:local` for filtering instead.

Need help? Contact [Datadog support][3].

## Further Reading

Additional helpful documentation, links, and articles:

- [Monitor JMeter test results with Datadog][4]

[1]: https://github.com/DataDog/jmeter-datadog-backend-listener
[2]: https://github.com/DataDog/integrations-core/blob/master/jmeter/metadata.csv
[5]: https://github.com/DataDog/jmeter-datadog-backend-listener/releases
[6]: https://jmeter-plugins.org/wiki/PluginsManager/
[7]: /account/settings#api
[8]: https://www.datadoghq.com/blog/engineering/computing-accurate-percentiles-with-ddsketch/
[9]: https://jmeter.apache.org/usermanual/generating-dashboard.html
[10]: https://jmeter.apache.org/usermanual/properties_reference.html#reporting
