Impact metrics are lightweight, application-level time-series metrics stored and visualized directly inside Unleash. They allow you to connect specific application data, such as request counts, error rates, or latency, to your feature flags and release plans.
These metrics help validate feature impact and automate release processes. For instance, you can monitor usage patterns or performance to determine if a feature meets its goals.
The SDK automatically attaches two context labels to every metric: `appName` and `environment`.
#### Counters
Use counters for cumulative values that only increase (total requests, errors):
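A counter is a value that only ever goes up; anything that can decrease belongs to a different metric type. The sketch below illustrates counter semantics only — the `Counter` class and method names here are illustrative assumptions, not the SDK's documented impact-metrics API, so check your installed SDK version for the actual calls:

```python
# Minimal sketch of counter semantics: the value is cumulative and
# monotonically increasing. Names here are illustrative, not the SDK API.
class Counter:
    def __init__(self, name: str, help_text: str = "") -> None:
        self.name = name
        self.help_text = help_text
        self._value = 0

    def increment(self, amount: int = 1) -> None:
        # Counters never decrease; reject negative deltas outright.
        if amount < 0:
            raise ValueError("counters only increase")
        self._value += amount

    @property
    def value(self) -> int:
        return self._value


# Count every request served; the SDK would flush this total in batches.
requests_total = Counter("requests_total", "Total HTTP requests served")
for _ in range(3):
    requests_total.increment()
```

Because the value is cumulative, dashboards typically graph its rate of change rather than the raw total.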
Impact metrics are batched and sent using the same interval as standard SDK metrics.
### Custom cache
By default, the Python SDK stores feature flags in an on-disk cache using `fcache`. If you need a different storage backend, for example Redis, memory-only, or a custom database, you can provide your own cache implementation.
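As a sketch of what such an implementation can look like, the class below keeps everything in memory and mirrors the method shape (`set`/`mset`/`get`/`exists`/`destroy`) that recent versions of the SDK's cache interface use. Treat the exact base class and the constructor parameter for wiring it into the client as assumptions; verify them against the version of `UnleashClient` you have installed:

```python
# In-memory cache sketch. The set/mset/get/exists/destroy method shape
# is assumed from recent SDK versions; confirm against your installed
# UnleashClient's cache interface before relying on it.
from typing import Any, Dict, Optional


class InMemoryCache:
    def __init__(self) -> None:
        self._store: Dict[str, Any] = {}

    def set(self, key: str, value: Any) -> None:
        self._store[key] = value

    def mset(self, data: Dict[str, Any]) -> None:
        # Store several key/value pairs in one call.
        self._store.update(data)

    def get(self, key: str, default: Optional[Any] = None) -> Any:
        return self._store.get(key, default)

    def exists(self, key: str) -> bool:
        return key in self._store

    def destroy(self) -> None:
        # Drop all cached state, e.g. on client shutdown.
        self._store.clear()
```

A memory-only cache like this trades persistence for speed: flags fetched from the server are lost on restart, so the client must complete an initial fetch before it can evaluate flags.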