ESSlivedata is a live data reduction visualization framework for the European Spallation Source (ESS).

**IMPORTANT**: In the devcontainer, this project uses micromamba with Python 3.11 in the base environment. The environment is automatically activated - you do not need to activate it manually.
For manual setup outside the devcontainer (if needed):

```sh
# Create virtual environment with Python 3.11
python3.11 -m venv venv

# Activate the virtual environment
source venv/bin/activate

# Install all development dependencies
pip install -r requirements/dev.txt

# Install package in editable mode
pip install -e .

# Setup pre-commit hooks (automatically runs on git commit)
pre-commit install
```

**Note**: In the devcontainer, all Python commands (`python`, `pytest`, `tox`, etc.) automatically use the micromamba base environment. Pre-commit hooks will run automatically on `git commit` if properly installed.
- `ConfigService`: Central configuration management with Pydantic models
- `DataService`: Manages data streams and notifies subscribers
- `WorkflowController`: Orchestrates workflow configuration and execution
- `ConfigBackedParam`: Translation layer between Param widgets and Pydantic models
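The subscriber pattern behind `DataService` can be illustrated with a toy observer. This is an illustrative sketch only: the class and method names below are invented for demonstration and are not the actual `DataService` API.

```python
from typing import Any, Callable


class ToyDataService:
    """Toy observer pattern: store the latest value per stream, notify subscribers."""

    def __init__(self) -> None:
        self._data: dict[str, Any] = {}
        self._subscribers: list[Callable[[str, Any], None]] = []

    def subscribe(self, callback: Callable[[str, Any], None]) -> None:
        # Register a callback that is invoked on every update.
        self._subscribers.append(callback)

    def update(self, stream: str, value: Any) -> None:
        # Store the latest value and notify all subscribers.
        self._data[stream] = value
        for callback in self._subscribers:
            callback(stream, value)


service = ToyDataService()
received: list[tuple[str, Any]] = []
service.subscribe(lambda stream, value: received.append((stream, value)))
service.update('detector_counts', [1, 2, 3])
```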
### Dashboard Architecture

## Service Factory Pattern

New services are created using `DataServiceBuilder`:

```python
builder = DataServiceBuilder(
    instrument='dummy',
    name='my_service',
    preprocessor_factory=MyPreprocessorFactory(),
    adapter=MyMessageAdapter(),  # optional
)
service = builder.build_from_config(topics=[...])
service.start()
```

All services use `OrchestratingProcessor` for job-based processing. See [src/ess/livedata/service_factory.py](src/ess/livedata/service_factory.py) for details.

2. Implement `make_preprocessor()` to create accumulators for different stream types
3. Register workflows with the instrument configuration
4. Use `DataServiceBuilder` to construct the service
5. Add a service module in `services/`
6. Add an instrument configuration in `config/defaults/`
### Adding Dashboard Widgets

1. Workflow and plotter configuration uses widgets generated from Pydantic models for validation on the frontend and serialization for Kafka communication

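As a sketch of that round trip, a Pydantic model can validate widget input and serialize to JSON for a Kafka payload. The model and field names below are hypothetical, invented for illustration.

```python
from pydantic import BaseModel, Field


class PlotterConfig(BaseModel):
    """Hypothetical plotter configuration validated on the frontend."""

    title: str
    num_bins: int = Field(default=100, gt=0)  # must be positive


# Validate widget input, then serialize for the Kafka message payload.
config = PlotterConfig(title='Detector counts', num_bins=50)
payload = config.model_dump_json()

# On the consumer side, deserialize and re-validate.
restored = PlotterConfig.model_validate_json(payload)
```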
### Message Processing

- Messages have `timestamp`, `stream`, and `value` fields
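A minimal sketch of such a message as a Python dataclass (the field types here are assumptions for illustration; the real message class is defined in the project source):

```python
import time
from dataclasses import dataclass
from typing import Any


@dataclass
class Message:
    """Sketch of a message envelope with the three fields described above."""

    timestamp: int  # e.g. nanoseconds since the epoch
    stream: str     # identifies the source stream
    value: Any      # the payload, e.g. detector counts


msg = Message(timestamp=time.time_ns(), stream='monitor_1', value=[10, 20])
```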