📝 Add docstrings to codex/extend-ci-pipeline-with-performance-thresholds
Docstring generation was requested by @shayancoin.
* #123 (comment)
The following files were modified:
* `frontend/tests/perf/run-perf-budget.ts`
* `scripts/ci/check_canary_metrics.py`
`frontend/tests/perf/run-perf-budget.ts`:

```diff
@@ -185,6 +217,11 @@ async function setupPerformanceObservers(page: Page) {
   });
 }
 
+/**
+ * Performs a configured wait step on the given page, supporting selector and network-idle waits.
+ *
+ * @param wait - Wait step configuration. If `type` is `"selector"`, waits for `selector` with a default timeout of 30000 ms unless `timeout_ms` is provided. If `type` is `"networkidle"`, waits for the page network to become idle with a default timeout of 60000 ms unless `timeout_ms` is provided; if `idle_ms` is set and greater than zero, waits an additional `idle_ms` milliseconds after network idle.
+ */
```
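The timeout-defaulting behavior this docstring describes can be sketched as pure logic, independent of any browser automation library (Python here for brevity; `WaitStep` and `resolve_wait` are illustrative names, not identifiers from the PR):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class WaitStep:
    type: str                       # "selector" or "networkidle"
    selector: Optional[str] = None
    timeout_ms: Optional[int] = None
    idle_ms: Optional[int] = None


def resolve_wait(wait: WaitStep) -> dict:
    """Return the wait action and effective timeouts per the documented defaults."""
    if wait.type == "selector":
        return {
            "action": "wait_for_selector",
            "selector": wait.selector,
            # Default selector timeout is 30000 ms unless overridden.
            "timeout_ms": wait.timeout_ms if wait.timeout_ms is not None else 30000,
        }
    if wait.type == "networkidle":
        plan = {
            "action": "wait_for_network_idle",
            # Default network-idle timeout is 60000 ms unless overridden.
            "timeout_ms": wait.timeout_ms if wait.timeout_ms is not None else 60000,
        }
        # Extra settle time applies only when idle_ms is set and positive.
        if wait.idle_ms and wait.idle_ms > 0:
            plan["extra_wait_ms"] = wait.idle_ms
        return plan
    raise ValueError(f"unknown wait type: {wait.type}")
```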
```diff
@@ -310,6 +390,12 @@ function aggregate(values: number[], aggregation: Aggregation | string): number
   }
 }
 
+/**
+ * Builds a JUnit-compatible report representing scenario metric results and budget violations.
+ *
+ * @param results - Array of scenario results to include in the report
+ * @returns A JUnitReport containing one test suite per scenario; each metric is a test case and metrics that exceeded their thresholds are represented as failures
+ */
```
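The one-suite-per-scenario, one-case-per-metric shape can be sketched as follows (a minimal Python sketch using `xml.etree`; the real code is TypeScript and its `JUnitReport` type is not shown here, so the input shape is an assumption):

```python
import xml.etree.ElementTree as ET


def build_junit_report(results):
    """results: [{"scenario": str, "metrics": [{"name", "value", "threshold"}]}].

    Returns JUnit-style XML with one <testsuite> per scenario and a <failure>
    child for every metric that exceeded its threshold.
    """
    suites = ET.Element("testsuites")
    for scenario in results:
        failed = [m for m in scenario["metrics"] if m["value"] > m["threshold"]]
        suite = ET.SubElement(
            suites, "testsuite",
            name=scenario["scenario"],
            tests=str(len(scenario["metrics"])),
            failures=str(len(failed)),
        )
        for metric in scenario["metrics"]:
            case = ET.SubElement(suite, "testcase", name=metric["name"])
            if metric["value"] > metric["threshold"]:
                ET.SubElement(
                    case, "failure",
                    message=f'{metric["name"]}={metric["value"]} exceeds {metric["threshold"]}',
                )
    return ET.tostring(suites, encoding="unicode")
```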
```diff
+/**
+ * Executes all configured performance scenarios, aggregates their metrics, and produces reports.
+ *
+ * Reads the performance budget configuration, runs each scenario the configured number of times, aggregates metric samples according to each metric's aggregation strategy, evaluates them against thresholds, writes a JSON summary and a JUnit XML report to the results directory, and sets a non-zero process exit code when any metric violates its threshold.
+ */
 async function run() {
   const config = readConfig();
   const defaults = config.defaults ?? {};
```
```diff
@@ -463,4 +574,4 @@ async function run() {
 run().catch((error) => {
   console.error('Failed to execute performance budgets', error);
```
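The run-repeat-aggregate-gate flow summarized in the `run()` docstring above can be illustrated in miniature (Python sketch; the aggregation names and the `budgets` tuple shape are assumptions, not the PR's actual types):

```python
import statistics


def aggregate(values, aggregation):
    """Collapse repeated samples into one number (strategies here are illustrative)."""
    if aggregation == "median":
        return statistics.median(values)
    if aggregation == "mean":
        return statistics.fmean(values)
    if aggregation == "p95":
        ordered = sorted(values)
        index = min(len(ordered) - 1, round(0.95 * (len(ordered) - 1)))
        return ordered[index]
    raise ValueError(f"unknown aggregation: {aggregation}")


def evaluate_budgets(samples, budgets):
    """samples: {metric: [values]}; budgets: {metric: (aggregation, threshold)}.

    Returns (violations, exit_code); exit_code is non-zero when any aggregated
    metric exceeds its threshold, mirroring the docstring's gating behavior.
    """
    violations = []
    for metric, (aggregation, threshold) in budgets.items():
        value = aggregate(samples[metric], aggregation)
        if value > threshold:
            violations.append((metric, value, threshold))
    return violations, (1 if violations else 0)
```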
`scripts/ci/check_canary_metrics.py`:

```diff
+    Query a Prometheus instant query endpoint and return the numeric result if present.
+
+    Returns:
+        float: Numeric value extracted from the Prometheus response, or `None` if the HTTP request failed or the response did not contain a usable numeric result.
```
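A minimal sketch of such a query helper, assuming the standard Prometheus HTTP API response shape (`status`/`data.result`); the function names are illustrative, not necessarily those in `check_canary_metrics.py`:

```python
import json
from urllib import parse, request


def extract_value(payload):
    """Pull the first numeric sample out of an instant-query payload, else None."""
    try:
        if payload.get("status") != "success":
            return None
        result = payload["data"]["result"]
        if not result:
            return None
        # Instant-query samples are [timestamp, "value-as-string"].
        return float(result[0]["value"][1])
    except (KeyError, IndexError, TypeError, ValueError):
        return None


def query_prometheus(base_url, promql, timeout=10):
    """Run an instant query; return the numeric result, or None on any failure."""
    url = f"{base_url.rstrip('/')}/api/v1/query?{parse.urlencode({'query': promql})}"
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            payload = json.load(resp)
    except Exception:
        return None
    return extract_value(payload)
```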
```diff
+    Collect canary metrics from Prometheus and Tempo based on environment configuration.
+
+    Attempts to query Prometheus for current P95 latency and error rate and, when configured, previous-period metrics and Tempo trace P95 latency. Required environment variables for live collection are PROMETHEUS_URL, PROMETHEUS_LATENCY_QUERY, and PROMETHEUS_ERROR_QUERY. Optional environment variables:
+    - PROMETHEUS_PREVIOUS_LATENCY_QUERY, PROMETHEUS_PREVIOUS_ERROR_QUERY: queries for previous-period metrics.
+    - TEMPO_URL, TEMPO_TRACE_QUERY: Tempo search API and query for trace latency.
+    - BUILD_TAG or GITHUB_SHA: current build identifier.
+    - PREVIOUS_BUILD_TAG: previous build identifier.
+
+    If Prometheus is unreachable, required configuration is missing, or the primary latency or error queries return no data, the function returns None to signal that callers should fall back to fixture data. On success, returns a CanaryMetrics instance populated with collected values (including trace_latency_p95_ms when available), previous values when provided, build metadata, and a UTC ISO-like generated_at timestamp.
+
+    Returns:
+        Optional[CanaryMetrics]: A populated CanaryMetrics object when live collection succeeds, or `None` when live data is unavailable and a fixture should be used.
```
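The required-vs-optional environment handling the docstring spells out could look roughly like this (a sketch; the returned dict layout is an assumption, and the real function builds a `CanaryMetrics` instance instead):

```python
import os

# Live collection is only attempted when all three of these are set.
REQUIRED = ("PROMETHEUS_URL", "PROMETHEUS_LATENCY_QUERY", "PROMETHEUS_ERROR_QUERY")


def live_collection_config(env=os.environ):
    """Return the query configuration, or None when required settings are absent."""
    if any(not env.get(key) for key in REQUIRED):
        return None
    return {
        "prometheus_url": env["PROMETHEUS_URL"],
        "latency_query": env["PROMETHEUS_LATENCY_QUERY"],
        "error_query": env["PROMETHEUS_ERROR_QUERY"],
        # Optional previous-period and Tempo settings may be missing.
        "previous_latency_query": env.get("PROMETHEUS_PREVIOUS_LATENCY_QUERY"),
        "previous_error_query": env.get("PROMETHEUS_PREVIOUS_ERROR_QUERY"),
        "tempo_url": env.get("TEMPO_URL"),
        "tempo_trace_query": env.get("TEMPO_TRACE_QUERY"),
        # BUILD_TAG wins; GITHUB_SHA is the fallback build identifier.
        "build_tag": env.get("BUILD_TAG") or env.get("GITHUB_SHA"),
        "previous_build_tag": env.get("PREVIOUS_BUILD_TAG"),
    }
```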
```diff
+    Load canary metrics from configured services, falling back to a JSON fixture when live collection is unavailable.
+
+    If environment and service queries provide metrics, those are returned; otherwise the fixture path from CANARY_METRICS_FIXTURE (or the default "tests/perf/canary-metrics.fixture.json") is used and its contents are returned. The chosen fixture path is printed when the fallback is used.
+
+    Returns:
+        CanaryMetrics: Collected canary metrics and metadata, sourced from live services when available or from the fixture otherwise.
```
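The fixture fallback described above, sketched with a pluggable live collector (names are illustrative; the real function returns a `CanaryMetrics` object rather than a plain dict):

```python
import json
import os
from pathlib import Path

DEFAULT_FIXTURE = "tests/perf/canary-metrics.fixture.json"


def load_metrics(collect_live, env=os.environ):
    """Return live metrics when available, else load the configured JSON fixture."""
    metrics = collect_live()
    if metrics is not None:
        return metrics
    fixture = env.get("CANARY_METRICS_FIXTURE", DEFAULT_FIXTURE)
    # Announce which fixture is used so CI logs show the data source.
    print(f"Live collection unavailable, using fixture: {fixture}")
    return json.loads(Path(fixture).read_text())
```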
```diff
+    Run the canary metric validation flow, emit a human-readable summary, write a pass/fail summary file, and return an exit code.
+
+    The function loads metrics, evaluates them against configured thresholds, prints status and comparisons to stdout/stderr, writes a JSON summary file indicating pass or fail, and returns a code appropriate to the result.
+
+    Returns:
+        int: `0` if all checks pass, `1` if any check fails.
```
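A toy version of this pass/fail flow (the threshold mapping and summary-file schema here are assumptions, not the script's actual format):

```python
import json
from pathlib import Path


def run_checks(metrics, thresholds, summary_path):
    """Evaluate metrics against thresholds, write a JSON summary, return 0 or 1."""
    failures = []
    for name, limit in thresholds.items():
        value = metrics.get(name)
        if value is None or value > limit:
            failures.append(name)
            print(f"FAIL {name}: {value} > {limit}")
        else:
            print(f"PASS {name}: {value} <= {limit}")
    # The summary file lets later CI steps read the outcome without re-running.
    Path(summary_path).write_text(json.dumps(
        {"status": "fail" if failures else "pass", "failures": failures}))
    return 1 if failures else 0
```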