@@ -129,35 +129,68 @@ passing in the registry that was passed to `PrometheusMiddleware`.
129129
130130## Prometheus Metrics in Multi-Process Mode 
131131
132- When running `FastStream` with multiple worker processes, you need to configure Prometheus metrics collection specially: 
133- 
134- 1. Set the `PROMETHEUS_MULTIPROC_DIR` environment variable to a writable directory 
135- 2. Initialize your collector registry with multiprocess support: 
136- 
137-     ```python linenums="1" hl_lines="8" 
138-     from prometheus_client import CollectorRegistry, multiprocess 
139-     import os 
140- 
141-     multiprocess_dir = os.getenv("PROMETHEUS_MULTIPROC_DIR") 
142- 
143-     registry = CollectorRegistry() 
144-     if multiprocess_dir: 
145-         multiprocess.MultiProcessCollector(registry, path=multiprocess_dir) 
146- 
147-     broker = KafkaBroker( 
148-         middlewares=[ 
149-             KafkaPrometheusMiddleware( 
150-                 registry=registry, 
151-                 app_name="your-app-name" 
152-             ) 
153-         ] 
154-     ) 
155-     ``` 
156- 
157- The metrics directory must: 
158- * Exist before application start 
159- * Be writable by all worker processes 
160- * Be on a filesystem accessible to all workers 
132+ When running FastStream with multiple worker processes, follow these steps to properly configure Prometheus metrics collection: 
133+ 
134+ ### Basic Configuration 
135+ 1. Set the `PROMETHEUS_MULTIPROC_DIR` environment variable to a writable directory: 
136+    ```bash 
137+    export PROMETHEUS_MULTIPROC_DIR=/path/to/metrics/directory 
138+    ``` 
139+     
140+ 2. Once the environment variable is set, metrics collection automatically works in multiprocess mode. Here's a minimal working example, followed by an optional startup check:
141+ 
142+ ```python linenums="1" hl_lines="8"
143+ import os 
144+ 
145+ from prometheus_client import CollectorRegistry 
146+ 
147+ from faststream.kafka import KafkaBroker 
148+ from faststream.kafka.prometheus import KafkaPrometheusMiddleware 
149+ 
150+ broker = KafkaBroker( 
151+     middlewares=[ 
152+         KafkaPrometheusMiddleware( 
153+             registry=CollectorRegistry(),  
154+             app_name="your-app-name"
155+         ) 
156+     ] 
157+ ) 
158+ ```
159+ 
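If the environment variable is missing, `prometheus_client` silently falls back to regular in-process metrics, so it can be worth failing fast at startup. Below is a minimal sketch of such a check; `ensure_multiproc_dir` is an illustrative helper name, not part of FastStream or `prometheus_client`:

```python
import os
from pathlib import Path


def ensure_multiproc_dir() -> Path:
    """Fail fast if multiprocess metrics collection is not configured."""
    raw_path = os.environ.get("PROMETHEUS_MULTIPROC_DIR")
    if not raw_path:
        raise RuntimeError("PROMETHEUS_MULTIPROC_DIR is not set")

    path = Path(raw_path)
    if not path.is_dir():
        raise RuntimeError(f"{path} does not exist or is not a directory")
    if not os.access(path, os.W_OK):
        raise RuntimeError(f"{path} is not writable by this process")
    return path


ensure_multiproc_dir()  # call this before creating the broker
```
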
160+ ### Metrics Export Endpoint
161+ To export metrics in multi-process mode, you need a dedicated endpoint that aggregates samples from every worker process:
162+ 
163+ ```python linenums="1" hl_lines="8"
164+ import os 
165+ 
166+ from prometheus_client import CollectorRegistry, multiprocess, generate_latest 
167+ from prometheus_client import CONTENT_TYPE_LATEST 
168+ 
169+ from faststream.asgi import AsgiResponse, get
170+ 
171+ registry = CollectorRegistry()
172+ 
173+ @get 
174+ async def metrics(scope): 
175+     if path := os.environ.get("PROMETHEUS_MULTIPROC_DIR"):
176+         registry_ = CollectorRegistry()
177+         multiprocess.MultiProcessCollector(registry_, path=path) 
178+     else: 
179+         registry_ = registry 
180+ 
181+     headers = {"Content-Type": CONTENT_TYPE_LATEST}
182+     return AsgiResponse(generate_latest(registry_), status_code=200, headers=headers)
183+ ```
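
The handler above only builds the response; to actually serve it, you still need to mount it on the ASGI application. A minimal sketch, assuming the `broker` from the earlier snippet is importable in the same module and using FastStream's `AsgiFastStream` with its `asgi_routes` argument (the `/metrics` path is just an example):

```python
from faststream.asgi import AsgiFastStream

# Expose the broker and the metrics handler as a single ASGI application.
app = AsgiFastStream(
    broker,
    asgi_routes=[("/metrics", metrics)],
)
```

Because the result is an ordinary ASGI application, it can be started by any ASGI server that supports multiple workers, which is exactly the scenario the multi-process configuration above addresses.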
184+ 
185+ ### Important Requirements
186+ 1. The metrics directory must: 
187+    - Exist before application start 
188+    - Be writable by all worker processes 
189+    - Be on a filesystem accessible to all workers 
190+    - Be emptied between application runs 
191+ 2. For better performance: 
192+    - Consider mounting the directory on `tmpfs`
193+    - Set up regular cleanup of old metric files (see the sketch after this list)
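
The cleanup mentioned in the last item can happen right before the worker processes start. A rough sketch, assuming the directory is used exclusively for Prometheus multiprocess files; `prepare_multiproc_dir` is an illustrative name, not an existing FastStream or `prometheus_client` helper:

```python
import os
from pathlib import Path


def prepare_multiproc_dir() -> None:
    """Create the metrics directory and drop files left over from previous runs."""
    path = Path(os.environ["PROMETHEUS_MULTIPROC_DIR"])
    path.mkdir(parents=True, exist_ok=True)

    # prometheus_client stores multiprocess samples as *.db files in this directory;
    # removing stale ones keeps data from old runs out of the new ones.
    for stale_file in path.glob("*.db"):
        stale_file.unlink()


prepare_multiproc_dir()  # run once, before spawning workers
```

For workers that exit while the application keeps running, `prometheus_client` also provides `multiprocess.mark_process_dead(pid)` to clean up the corresponding gauge files.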
161194
162195### Grafana dashboard
163196