
Commit 414ff42

timmarkhuff, Auto-format Bot, and Tim Huff authored
Migrate Edge Configuration to SDK (#413)
Moves the edge endpoint config schema from `edge-endpoint` into the Python SDK so users can programmatically build, validate, serialize, and deserialize edge configurations.

## What's new

- New `groundlight.edge` config models ported from edge-endpoint:
  - `EdgeEndpointConfig` (top-level config with `global_config`)
  - `DetectorsConfig` (detector-scoped workflows)
  - `InferenceConfig` (immutable/frozen per-detector inference behavior)
  - `GlobalConfig`, `DetectorConfig` (supporting models)
- Preset constants aligned with edge-endpoint defaults:
  - `DEFAULT`
  - `EDGE_ANSWERS_WITH_ESCALATION`
  - `NO_CLOUD`
  - `DISABLED`
- Strict validation added with `extra="forbid"` across config models.

## Model design

- Shared detector/inference behavior is implemented once in the internal `ConfigBase`.
- `DetectorsConfig` and `EdgeEndpointConfig` both inherit from `ConfigBase`.
- `EdgeEndpointConfig` is top-level and flat (`global_config`, `edge_inference_configs`, `detectors`).
- `DetectorsConfig` remains available as a detector-only model.

## Validation behavior

- `disable_cloud_escalation=True` requires `always_return_edge_prediction=True`.
- Duplicate detector IDs are rejected.
- Every detector's `edge_inference_config` must exist in `edge_inference_configs`.
- Inference-config dict-key/name mismatches are rejected.
- For payload dict input, `InferenceConfig.name` is hydrated from the map keys.

## Serialization / parsing API

- `EdgeEndpointConfig.from_yaml(filename=..., yaml_str=...)`
  - Accepts exactly one input source.
  - Supports both file-path and YAML-string workflows explicitly.
- `EdgeEndpointConfig.to_payload()`
  - Returns a payload dict via `model_dump()`.
- `EdgeEndpointConfig.from_payload(payload)`
  - Reconstructs the config from a payload dict via `model_validate()`.
- `DetectorsConfig.to_payload()`
  - Returns a detector-scoped payload dict via `model_dump()`.
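The constructor-time checks listed above (duplicate detector IDs, references to undefined inference configs) can be sketched as a standalone function. This is an illustrative stand-in using plain dicts, not the SDK's pydantic validator; `validate_detectors` and the dict shapes are assumptions made for the sketch:

```python
def validate_detectors(detectors: list[dict], edge_inference_configs: dict) -> None:
    """Illustrative stand-in for the SDK's model validator: rejects duplicate
    detector IDs and detectors that reference an undefined inference config."""
    seen: set[str] = set()
    duplicates: set[str] = set()
    for det in detectors:
        det_id = det["detector_id"]
        if det_id in seen:
            duplicates.add(det_id)
        seen.add(det_id)
    if duplicates:
        raise ValueError(f"Duplicate detector IDs are not allowed: {', '.join(sorted(duplicates))}.")
    for det in detectors:
        if det["edge_inference_config"] not in edge_inference_configs:
            raise ValueError(f"Edge inference config '{det['edge_inference_config']}' not defined.")


configs = {"default": {}, "no_cloud": {}}
detectors = [
    {"detector_id": "det_abc123", "edge_inference_config": "no_cloud"},
    {"detector_id": "det_def456", "edge_inference_config": "default"},
]
validate_detectors(detectors, configs)  # unique IDs, known configs: no error
```

Running the same function with a repeated `detector_id`, or with a config name missing from `configs`, raises `ValueError` as described in the bullets above.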
## Usage

```python
from groundlight.edge import (
    DEFAULT,
    EDGE_ANSWERS_WITH_ESCALATION,
    GlobalConfig,
    NO_CLOUD,
    DetectorsConfig,
    EdgeEndpointConfig,
)

# Full edge endpoint config
full_config = EdgeEndpointConfig(
    global_config=GlobalConfig(refresh_rate=30.0, confident_audit_rate=0.0),
)
full_config.add_detector("det_abc123", NO_CLOUD)
full_config.add_detector("det_def456", DEFAULT)
payload = full_config.to_payload()
round_trip = EdgeEndpointConfig.from_payload(payload)

# Detector-only config
detectors_config = DetectorsConfig()
detectors_config.add_detector("det_abc123", NO_CLOUD)
detectors_config.add_detector("det_xyz789", EDGE_ANSWERS_WITH_ESCALATION)
detectors_payload = detectors_config.to_payload()
```

## Tests

Updated unit coverage in `test/unit/test_edge_config.py` includes:

- add-detector behavior and conflict handling
- constructor-time duplicate/cross-reference validation
- hydration of inference-config names from payload keys
- top-level payload acceptance and unknown-field rejection
- `from_yaml` (YAML string + filename + argument-validation cases)
- `to_payload` / `from_payload` round-trip
- direct literal-payload `from_payload` constructor coverage
- `InferenceConfig` validation constraints

## Dependencies

- `pyyaml` added as a runtime dependency in `python-sdk` (`pyproject.toml`).

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>
Co-authored-by: Tim Huff <thuff@axon.com>
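The `to_payload` / `from_payload` round-trip exercised in the tests boils down to dumping the config to a plain JSON-safe dict and re-validating it. A minimal stdlib sketch of that pattern, using a hypothetical `EdgeConfigSketch` dataclass rather than the real pydantic model:

```python
import json
from dataclasses import asdict, dataclass, field


@dataclass
class EdgeConfigSketch:
    """Hypothetical stand-in for EdgeEndpointConfig (not the SDK class)."""

    global_config: dict = field(default_factory=lambda: {"refresh_rate": 60.0})
    detectors: list = field(default_factory=list)

    def to_payload(self) -> dict:
        # Analogous to model_dump(): the config as a plain dict.
        return asdict(self)

    @classmethod
    def from_payload(cls, payload: dict) -> "EdgeConfigSketch":
        # Analogous to model_validate(): rebuild the object from the dict.
        return cls(**payload)


cfg = EdgeConfigSketch(detectors=[{"detector_id": "det_abc123", "edge_inference_config": "no_cloud"}])
# Serialize all the way to a JSON string and back, then reconstruct.
round_trip = EdgeConfigSketch.from_payload(json.loads(json.dumps(cfg.to_payload())))
assert round_trip == cfg  # lossless round-trip through a plain payload
```

The SDK's version adds strict validation on the way back in (`extra="forbid"`, cross-reference checks), which this sketch omits.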
1 parent 4e9f371 commit 414ff42


4 files changed (+538, −1 lines)


pyproject.toml

Lines changed: 2 additions & 1 deletion
```diff
@@ -9,7 +9,7 @@ packages = [
     {include = "**/*.py", from = "src"},
 ]
 readme = "README.md"
-version = "0.24.0"
+version = "0.25.0"
 
 [tool.poetry.dependencies]
 # For certifi, use ">=" instead of "^" since it upgrades its "major version" every year, not really following semver
@@ -22,6 +22,7 @@ python-dateutil = "^2.9.0"
 requests = "^2.28.2"
 typer = "^0.15.4"
 urllib3 = "^2.6.1"
+pyyaml = "^6.0.3"
 
 [tool.poetry.group.dev.dependencies]
 datamodel-code-generator = "^0.35.0"
```

src/groundlight/edge/__init__.py

Lines changed: 23 additions & 0 deletions
```python
from .config import (
    DEFAULT,
    DISABLED,
    EDGE_ANSWERS_WITH_ESCALATION,
    NO_CLOUD,
    DetectorConfig,
    DetectorsConfig,
    EdgeEndpointConfig,
    GlobalConfig,
    InferenceConfig,
)

__all__ = [
    "DEFAULT",
    "DISABLED",
    "EDGE_ANSWERS_WITH_ESCALATION",
    "NO_CLOUD",
    "DetectorsConfig",
    "DetectorConfig",
    "EdgeEndpointConfig",
    "GlobalConfig",
    "InferenceConfig",
]
```

src/groundlight/edge/config.py

Lines changed: 215 additions & 0 deletions
```python
from typing import Any, Optional, Union

import yaml
from model import Detector
from pydantic import BaseModel, ConfigDict, Field, field_validator, model_validator
from typing_extensions import Self


class GlobalConfig(BaseModel):
    """Global runtime settings for edge-endpoint behavior."""

    model_config = ConfigDict(extra="forbid")

    refresh_rate: float = Field(
        default=60.0,
        description="The interval (in seconds) at which the inference server checks for a new model binary update.",
    )
    confident_audit_rate: float = Field(
        default=1e-5,  # A detector running at 1 FPS = ~100,000 IQ/day, so 1e-5 is ~1 confident IQ/day audited
        description="The probability that any given confident prediction will be sent to the cloud for auditing.",
    )


class InferenceConfig(BaseModel):
    """
    Configuration for edge inference on a specific detector.
    """

    # Keep shared presets immutable (DEFAULT/NO_CLOUD/etc.) so one mutation cannot globally change behavior.
    model_config = ConfigDict(extra="forbid", frozen=True)

    name: str = Field(..., exclude=True, description="A unique name for this inference config preset.")
    enabled: bool = Field(
        default=True, description="Whether the edge endpoint should accept image queries for this detector."
    )
    api_token: Optional[str] = Field(
        default=None, description="API token used to fetch the inference model for this detector."
    )
    always_return_edge_prediction: bool = Field(
        default=False,
        description=(
            "Indicates if the edge-endpoint should always provide edge ML predictions, regardless of confidence. "
            "When this setting is true, whether or not the edge-endpoint should escalate low-confidence predictions "
            "to the cloud is determined by `disable_cloud_escalation`."
        ),
    )
    disable_cloud_escalation: bool = Field(
        default=False,
        description=(
            "Never escalate ImageQueries from the edge-endpoint to the cloud. "
            "Requires `always_return_edge_prediction=True`."
        ),
    )
    min_time_between_escalations: float = Field(
        default=2.0,
        description=(
            "The minimum time (in seconds) to wait between cloud escalations for a given detector. "
            "Cannot be less than 0.0. "
            "Only applies when `always_return_edge_prediction=True` and `disable_cloud_escalation=False`."
        ),
    )

    @model_validator(mode="after")
    def validate_configuration(self) -> Self:
        if self.disable_cloud_escalation and not self.always_return_edge_prediction:
            raise ValueError(
                "The `disable_cloud_escalation` flag is only valid when `always_return_edge_prediction` is set to True."
            )
        if self.min_time_between_escalations < 0.0:
            raise ValueError("`min_time_between_escalations` cannot be less than 0.0.")
        return self


class DetectorConfig(BaseModel):
    """
    Configuration for a specific detector.
    """

    model_config = ConfigDict(extra="forbid")

    detector_id: str = Field(..., description="Detector ID")
    edge_inference_config: str = Field(..., description="Config for edge inference.")


class ConfigBase(BaseModel):
    """Shared detector/inference configuration behavior for edge config models."""

    model_config = ConfigDict(extra="forbid")

    edge_inference_configs: dict[str, InferenceConfig] = Field(default_factory=dict)
    detectors: list[DetectorConfig] = Field(default_factory=list)

    @field_validator("edge_inference_configs", mode="before")
    @classmethod
    def hydrate_inference_config_names(
        cls, value: dict[str, InferenceConfig | dict[str, Any]] | None
    ) -> dict[str, InferenceConfig | dict[str, Any]]:
        """Hydrate InferenceConfig.name from payload mapping keys."""
        if value is None:
            return {}
        if not isinstance(value, dict):
            return value

        hydrated_configs: dict[str, InferenceConfig | dict[str, Any]] = {}
        for name, config in value.items():
            if isinstance(config, InferenceConfig):
                hydrated_configs[name] = config
                continue
            if not isinstance(config, dict):
                raise TypeError("Each edge inference config must be an object.")
            hydrated_configs[name] = {"name": name, **config}
        return hydrated_configs

    @model_validator(mode="after")
    def validate_inference_configs(self):
        """
        Validates detector config state.

        Raises ValueError if dict keys mismatch InferenceConfig.name, detector IDs are duplicated,
        or any detector references an undefined inference config.
        """
        for name, config in self.edge_inference_configs.items():
            if name != config.name:
                raise ValueError(f"Edge inference config key '{name}' must match InferenceConfig.name '{config.name}'.")

        seen_detector_ids = set()
        duplicate_detector_ids = set()
        for detector_config in self.detectors:
            detector_id = detector_config.detector_id
            if detector_id in seen_detector_ids:
                duplicate_detector_ids.add(detector_id)
            else:
                seen_detector_ids.add(detector_id)
        if duplicate_detector_ids:
            duplicates = ", ".join(sorted(duplicate_detector_ids))
            raise ValueError(f"Duplicate detector IDs are not allowed: {duplicates}.")

        for detector_config in self.detectors:
            if detector_config.edge_inference_config not in self.edge_inference_configs:
                raise ValueError(f"Edge inference config '{detector_config.edge_inference_config}' not defined.")
        return self

    def add_detector(self, detector: Union[str, Detector], edge_inference_config: InferenceConfig) -> None:
        """Add a detector with the given inference config. Accepts detector ID or Detector object."""
        detector_id = detector.id if isinstance(detector, Detector) else detector
        if any(existing.detector_id == detector_id for existing in self.detectors):
            raise ValueError(f"A detector with ID '{detector_id}' already exists.")

        existing = self.edge_inference_configs.get(edge_inference_config.name)
        if existing is None:
            self.edge_inference_configs[edge_inference_config.name] = edge_inference_config
        elif existing != edge_inference_config:
            raise ValueError(
                f"A different inference config named '{edge_inference_config.name}' is already registered."
            )

        self.detectors.append(DetectorConfig(detector_id=detector_id, edge_inference_config=edge_inference_config.name))

    def to_payload(self) -> dict[str, Any]:
        """Return this config as a payload dictionary."""
        return self.model_dump()


class DetectorsConfig(ConfigBase):
    """
    Detector and inference-config mappings for edge inference.
    """


class EdgeEndpointConfig(ConfigBase):
    """
    Top-level edge endpoint configuration.
    """

    global_config: GlobalConfig = Field(default_factory=GlobalConfig)

    @classmethod
    def from_yaml(
        cls,
        filename: Optional[str] = None,
        yaml_str: Optional[str] = None,
    ) -> "EdgeEndpointConfig":
        """Create an EdgeEndpointConfig from a YAML filename or YAML string."""
        if filename is None and yaml_str is None:
            raise ValueError("Either filename or yaml_str must be provided.")
        if filename is not None and yaml_str is not None:
            raise ValueError("Only one of filename or yaml_str can be provided.")
        if filename is not None:
            if not filename.strip():
                raise ValueError("filename must be a non-empty path when provided.")
            with open(filename, "r") as f:
                yaml_str = f.read()

        yaml_text = yaml_str or ""
        parsed = yaml.safe_load(yaml_text) or {}
        return cls.model_validate(parsed)

    @classmethod
    def from_payload(cls, payload: dict[str, Any]) -> "EdgeEndpointConfig":
        """Construct an EdgeEndpointConfig from a payload dictionary."""
        return cls.model_validate(payload)


# Preset inference configs matching the standard edge-endpoint defaults.
DEFAULT = InferenceConfig(name="default")
EDGE_ANSWERS_WITH_ESCALATION = InferenceConfig(
    name="edge_answers_with_escalation",
    always_return_edge_prediction=True,
    min_time_between_escalations=2.0,
)
NO_CLOUD = InferenceConfig(
    name="no_cloud",
    always_return_edge_prediction=True,
    disable_cloud_escalation=True,
)
DISABLED = InferenceConfig(name="disabled", enabled=False)
```