10 changes: 10 additions & 0 deletions .env.example
@@ -50,3 +50,13 @@ SGR__PROMPTS__CLARIFICATION_RESPONSE_FILE=path/to/your/clarification_response.tx
# =======================================================
# Note: MCP configuration is complex and better suited for config.yaml
# See config.yaml.example for MCP server configuration examples

# =======================================================
# Observability: Langfuse integration
# =======================================================
SGR__LANGFUSE__ENABLED=false
# SGR__LANGFUSE__PUBLIC_KEY=pk-lf-xxx
# SGR__LANGFUSE__SECRET_KEY=sk-lf-xxx
# SGR__LANGFUSE__HOST=http://localhost:3000

# Shorthand (credentials via LANGFUSE_* env vars): SGR__LANGFUSE=true
9 changes: 9 additions & 0 deletions config.yaml.example
@@ -10,6 +10,15 @@ llm:
temperature: 0.4 # Temperature (0.0-1.0)
# proxy: "socks5://127.0.0.1:1081" # Optional proxy (socks5:// or http://)

# Observability (Langfuse)
# When enabled, AgentFactory will create a Langfuse AsyncOpenAI client instead of the standard AsyncOpenAI.
# Credentials can be set here or via LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST env vars.
langfuse:
enabled: false
# public_key: "pk-lf-xxx"
# secret_key: "sk-lf-xxx"
# host: "http://localhost:3000"

# Execution Settings
execution:
max_clarifications: 3 # Max clarification requests
37 changes: 37 additions & 0 deletions docs/en/framework/configuration.md
@@ -67,6 +67,43 @@ config = GlobalConfig.from_yaml("config.yaml")

An example can be found in [`config.yaml.example`](https://github.com/vamplabAI/sgr-agent-core/blob/main/config.yaml.example).

### Observability and Langfuse integration

SGR Agent Core can optionally integrate with [Langfuse](https://langfuse.com) for tracing LLM calls.
Configuration is done via the `langfuse` section in `config.yaml`:

```yaml
langfuse:
enabled: true
public_key: "pk-lf-..."
secret_key: "sk-lf-..."
host: "http://localhost:3000" # omit for cloud.langfuse.com
```

The same settings can be passed via environment variables:

```bash
SGR__LANGFUSE__ENABLED=true
SGR__LANGFUSE__PUBLIC_KEY=pk-lf-xxx
SGR__LANGFUSE__SECRET_KEY=sk-lf-xxx
SGR__LANGFUSE__HOST=http://localhost:3000
```
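The `SGR__LANGFUSE__ENABLED` form follows the usual double-underscore convention for nested settings: prefix, then section, then key. The framework's settings library handles this internally; the sketch below is only a standalone illustration of the mapping, not the project's actual parser:

```python
def parse_nested_env(env: dict, prefix: str = "SGR__", delim: str = "__") -> dict:
    """Fold flat PREFIX__SECTION__KEY=value pairs into a nested dict."""
    out: dict = {}
    for key, value in env.items():
        if not key.startswith(prefix):
            continue  # ignore unrelated environment variables
        parts = key[len(prefix):].lower().split(delim)
        node = out
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return out

settings = parse_nested_env({
    "SGR__LANGFUSE__ENABLED": "true",
    "SGR__LANGFUSE__HOST": "http://localhost:3000",
    "UNRELATED": "ignored",
})
print(settings)
```

So `SGR__LANGFUSE__HOST` lands under `langfuse.host`, mirroring the YAML layout above.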

**Shorthand** (when credentials are already in `LANGFUSE_*` env vars):

```yaml
langfuse: true
```

When `langfuse.enabled` is `true`, `AgentFactory` creates the client using
`langfuse.openai.AsyncOpenAI` (drop-in replacement for the standard OpenAI client).
If `public_key`/`secret_key`/`host` are provided in config, Langfuse is initialized
with those credentials explicitly. Otherwise the Langfuse SDK falls back to reading
`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` from the environment.

If the `langfuse` package is not installed, the system logs a warning and falls back
to the standard `openai.AsyncOpenAI` client.
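The enable-if-available behaviour described above boils down to a try-import-with-fallback. A minimal standalone sketch of that pattern (the module names here are placeholders, not the framework's real ones):

```python
from importlib import import_module

def resolve_first_module(candidates):
    """Return the first importable module from candidates, mimicking the
    'use the instrumented client if installed, else fall back' behaviour."""
    for name in candidates:
        try:
            return import_module(name)
        except ImportError:
            continue  # package missing: try the next candidate
    raise ImportError(f"none of {candidates!r} could be imported")

# With the tracing package absent, the lookup falls through to the stdlib module.
mod = resolve_first_module(["nonexistent_tracing_pkg", "json"])
print(mod.__name__)
```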

### Parameter Override

**Key Feature:** `AgentDefinition` inherits all parameters from `GlobalConfig` and overrides only those explicitly specified. This allows creating minimal configurations by specifying only necessary changes.
36 changes: 36 additions & 0 deletions docs/ru/framework/configuration.md
@@ -68,6 +68,42 @@ config = GlobalConfig.from_yaml("config.yaml")
Пример можно найти в [`config.yaml.example`](https://github.com/vamplabAI/sgr-agent-core/blob/main/config.yaml.example).


### Наблюдаемость и интеграция с Langfuse

SGR Agent Core может опционально интегрироваться с [Langfuse](https://langfuse.com) для трассировки вызовов LLM.
Настройка производится через секцию `langfuse` в `config.yaml`:

```yaml
langfuse:
enabled: true
public_key: "pk-lf-..."
secret_key: "sk-lf-..."
host: "http://localhost:3000" # опустите для cloud.langfuse.com
```

Те же параметры можно задать через переменные окружения:

```bash
SGR__LANGFUSE__ENABLED=true
SGR__LANGFUSE__PUBLIC_KEY=pk-lf-xxx
SGR__LANGFUSE__SECRET_KEY=sk-lf-xxx
SGR__LANGFUSE__HOST=http://localhost:3000
```

**Сокращённая форма** (когда ключи уже заданы в `LANGFUSE_*` переменных окружения):

```yaml
langfuse: true
```

Когда `langfuse.enabled` установлен в `true`, `AgentFactory` создаёт клиент на основе
`langfuse.openai.AsyncOpenAI` (drop-in замена стандартного клиента OpenAI).
Если в конфиге указаны `public_key`/`secret_key`/`host`, Langfuse инициализируется
с этими учётными данными явно. В противном случае SDK Langfuse самостоятельно
читает `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY` и `LANGFUSE_HOST` из окружения.

Если пакет `langfuse` не установлен, система пишет предупреждение в лог и
откатывается к стандартному клиенту `openai.AsyncOpenAI`.

### Переопределение параметров

2 changes: 2 additions & 0 deletions pyproject.toml
@@ -48,6 +48,8 @@ dependencies = [
"jambo>=0.1.3.post2",
# Tools filtering
"rank-bm25>=0.2.2",
# Observability
"langfuse>=4.0.0",
]

[project.urls]
24 changes: 24 additions & 0 deletions sgr_agent_core/agent_definition.py
@@ -144,18 +144,42 @@ class ExecutionConfig(BaseModel, extra="allow"):
reports_dir: str = Field(default="reports", description="Directory for saving reports")


class LangfuseConfig(BaseModel):
"""Langfuse observability configuration."""

enabled: bool = Field(default=False, description="Enable Langfuse integration")
public_key: str | None = Field(default=None, description="Langfuse public key (pk-lf-...)")
secret_key: str | None = Field(default=None, description="Langfuse secret key (sk-lf-...)")
host: str | None = Field(default=None, description="Langfuse host URL (e.g. http://localhost:3000)")


class AgentConfig(BaseModel, extra="allow"):
"""Agent configuration with all settings.

The 'extra="allow"' allows additional fields for agent-specific
parameters (e.g., working_directory for file agents).
"""

langfuse: LangfuseConfig = Field(default_factory=LangfuseConfig, description="Langfuse observability settings")
llm: LLMConfig = Field(default_factory=LLMConfig, description="LLM settings")
execution: ExecutionConfig = Field(default_factory=ExecutionConfig, description="Execution settings")
prompts: PromptsConfig = Field(default_factory=PromptsConfig, description="Prompts settings")
mcp: MCPConfig = Field(default_factory=MCPConfig, description="MCP settings")

@field_validator("langfuse", mode="before")
@classmethod
def normalize_langfuse(cls, v):
"""Accept bool shorthand: langfuse: true → LangfuseConfig(enabled=True)."""
if isinstance(v, bool):
return {"enabled": v}
if isinstance(v, str):
return {"enabled": v.lower() in ("true", "1", "yes")}
return v

@property
def langfuse_enabled(self) -> bool:
return self.langfuse.enabled
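The shorthand handling can be exercised in isolation; this is a standalone copy of the validator's normalization logic, outside pydantic:

```python
def normalize_langfuse(v):
    """Accept bool/str shorthand: True or "true" becomes {"enabled": True}."""
    if isinstance(v, bool):
        return {"enabled": v}
    if isinstance(v, str):
        return {"enabled": v.lower() in ("true", "1", "yes")}
    return v  # dicts (full config) pass through unchanged

print(normalize_langfuse(True))
print(normalize_langfuse("YES"))
print(normalize_langfuse({"enabled": False, "host": "http://localhost:3000"}))
```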
Comment on lines +179 to +181
Member

Looks redundant



class ToolDefinition(BaseModel, extra="allow"):
"""Definition of a custom tool.
65 changes: 65 additions & 0 deletions sgr_agent_core/agent_factory.py
@@ -1,6 +1,8 @@
"""Agent Factory for dynamic agent creation from definitions."""

import inspect
import logging
from importlib import import_module
from typing import Any, Type, TypeVar

import httpx
@@ -26,6 +28,44 @@ class AgentFactory:
and create instances with the appropriate configuration.
"""

@classmethod
def _patch_langfuse_stream_close(cls) -> None:
"""Patch OpenAI streaming implementation for Langfuse compatibility.

Langfuse wraps the underlying HTTP response so that it exposes
``close()`` instead of ``aclose()``. OpenAI's
``AsyncChatCompletionStream.close`` calls
``self._response.aclose()`` which raises ``AttributeError``.

This patch replaces ``AsyncChatCompletionStream.close`` with a
version that prefers ``aclose()`` when available and falls back
to ``close()``, awaiting the result if necessary.
"""
try:
from openai.lib.streaming.chat._completions import AsyncChatCompletionStream
except Exception: # pragma: no cover
logger.warning("Failed to import OpenAI chat streaming module for Langfuse patch")
return

if getattr(AsyncChatCompletionStream.close, "_langfuse_patched", False):
return

async def safe_close(self) -> None: # type: ignore[override]
response = getattr(self, "_response", None)
if response is None:
return

close_method = getattr(response, "aclose", None) or getattr(response, "close", None)
if close_method is None:
return

result = close_method()
if inspect.isawaitable(result):
await result

safe_close._langfuse_patched = True # type: ignore[attr-defined]
AsyncChatCompletionStream.close = safe_close # type: ignore[assignment]
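The shim above reduces to "call whichever close method exists, and await the result only if it is awaitable". A self-contained sketch of that pattern with dummy response objects (the classes here are stand-ins, not real OpenAI or Langfuse types):

```python
import asyncio
import inspect

async def safe_close(response) -> None:
    """Close a response that may expose async aclose() or sync close()."""
    close_method = getattr(response, "aclose", None) or getattr(response, "close", None)
    if close_method is None:
        return
    result = close_method()
    if inspect.isawaitable(result):
        await result

class SyncResponse:
    closed = False
    def close(self):          # Langfuse-wrapped style: sync close()
        self.closed = True

class AsyncResponse:
    closed = False
    async def aclose(self):   # plain httpx style: async aclose()
        self.closed = True

async def main():
    s, a = SyncResponse(), AsyncResponse()
    await safe_close(s)
    await safe_close(a)
    return s.closed, a.closed

print(asyncio.run(main()))
```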

@classmethod
def _create_client(cls, llm_config: LLMConfig) -> AsyncOpenAI:
"""Create OpenAI client from configuration.
@@ -36,10 +76,35 @@ def _create_client(cls, llm_config: LLMConfig) -> AsyncOpenAI:
Returns:
Configured AsyncOpenAI client
"""
config = GlobalConfig()
client_kwargs = {"base_url": llm_config.base_url, "api_key": llm_config.api_key}
if llm_config.proxy:
client_kwargs["http_client"] = httpx.AsyncClient(proxy=llm_config.proxy)

if getattr(config, "langfuse_enabled", False):
try:
lf_cfg = config.langfuse
if lf_cfg.public_key or lf_cfg.secret_key or lf_cfg.host:
LangfuseClient = getattr(import_module("langfuse"), "Langfuse")
kwargs = {}
if lf_cfg.public_key:
kwargs["public_key"] = lf_cfg.public_key
if lf_cfg.secret_key:
kwargs["secret_key"] = lf_cfg.secret_key
if lf_cfg.host:
kwargs["host"] = lf_cfg.host
LangfuseClient(**kwargs)
logger.info("Langfuse initialized with explicit credentials from config")
LangfuseAsyncOpenAI = getattr(import_module("langfuse.openai"), "AsyncOpenAI")
cls._patch_langfuse_stream_close()
logger.info("Creating Langfuse AsyncOpenAI client (langfuse_enabled=True)")
return LangfuseAsyncOpenAI(**client_kwargs)
except ImportError:
logger.warning(
"Langfuse is enabled but 'langfuse' package is not available. "
"Falling back to standard AsyncOpenAI client."
)

Comment on lines +84 to +107
Member

@virrius virrius Mar 30, 2026

This doesn't look healthy.

  1. The validation can be moved into the model
  2. Extra kwargs are being assembled with extra ifs for no clear reason
  3. If the user doesn't have the module installed, it's better to fail explicitly than to fall back implicitly
  4. What problem does the stream-close patch solve?

I ran it against this test, everything seems to work:

    from sgr_agent_core.agent_config import GlobalConfig
    config = GlobalConfig().from_yaml("config.yaml")
    from langfuse import Langfuse
    Langfuse(
        public_key=config.langfuse.public_key,
        secret_key=config.langfuse.secret_key,
        host=config.langfuse.host,
    )
    from langfuse.openai import AsyncOpenAI
    async_client = AsyncOpenAI(
        base_url=config.llm.base_url,
        api_key=config.llm.api_key,
    )
    completion = await async_client.chat.completions.create(
      name="test-chat",
      model="gpt-4o",
      messages=[
          {"role": "system", "content": "Tell me about dirigibles"},
          {"role": "user", "content": "Tell me about dirigibles"}],
      temperature=0,
      metadata={"someMetadataKey": "someValue"},
      stream=True
    )
    async for chunk in completion:
        print(chunk.choices[0].delta.content, end="")
    print(Langfuse)
    print(completion)

Collaborator Author

  1. It's assembled via kwargs because Langfuse, when its parameters are empty (i.e. when kwargs is empty), initializes from the LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, and LANGFUSE_BASE_URL env variables.

Member

@virrius virrius Mar 30, 2026

  1. We already have a ready-made pydantic model that stores and validates all of this. Why go back to a raw dict?

return AsyncOpenAI(**client_kwargs)

@classmethod
102 changes: 102 additions & 0 deletions tests/test_agent_config_integration.py
@@ -77,6 +77,108 @@ def test_invalid_config_values(self):
assert agent.task_messages[0]["content"] == "Invalid config test"


class TestLangfuseConfiguration:
"""Tests for Langfuse-related configuration flags."""

def test_langfuse_enabled_default_false(self, monkeypatch):
"""Test that langfuse_enabled is False by default."""
from sgr_agent_core import agent_config as agent_config_module

agent_config_module.GlobalConfig._instance = None
agent_config_module.GlobalConfig._initialized = False
monkeypatch.delenv("SGR__LANGFUSE", raising=False)
monkeypatch.delenv("SGR__LANGFUSE__ENABLED", raising=False)

config = GlobalConfig()

assert hasattr(config, "langfuse_enabled")
assert config.langfuse_enabled is False

def test_langfuse_enabled_from_env(self, monkeypatch):
"""Test that langfuse_enabled can be enabled via nested env
variable."""
from sgr_agent_core import agent_config as agent_config_module

agent_config_module.GlobalConfig._instance = None
agent_config_module.GlobalConfig._initialized = False

monkeypatch.setenv("SGR__LANGFUSE__ENABLED", "true")

config = GlobalConfig()

assert config.langfuse_enabled is True

def test_langfuse_enabled_from_env_shorthand(self, monkeypatch):
"""Test backward-compat: SGR__LANGFUSE=true still works."""
from sgr_agent_core import agent_config as agent_config_module

agent_config_module.GlobalConfig._instance = None
agent_config_module.GlobalConfig._initialized = False
monkeypatch.delenv("SGR__LANGFUSE__ENABLED", raising=False)

monkeypatch.setenv("SGR__LANGFUSE", "true")

config = GlobalConfig()

assert config.langfuse_enabled is True

def test_langfuse_enabled_from_yaml(self, tmp_path, monkeypatch):
"""Test that langfuse_enabled can be enabled via nested config.yaml."""
from sgr_agent_core import agent_config as agent_config_module

agent_config_module.GlobalConfig._instance = None
agent_config_module.GlobalConfig._initialized = False

config_path = tmp_path / "config.yaml"
config_path.write_text(
"\n".join(
[
"llm:",
' api_key: "test-key"',
' base_url: "https://api.openai.com/v1"',
"langfuse:",
" enabled: true",
]
),
encoding="utf-8",
)

config = GlobalConfig.from_yaml(str(config_path))

assert config.langfuse_enabled is True

def test_langfuse_yaml_with_credentials(self, tmp_path, monkeypatch):
"""Test that Langfuse credentials are parsed from config.yaml."""
from sgr_agent_core import agent_config as agent_config_module

agent_config_module.GlobalConfig._instance = None
agent_config_module.GlobalConfig._initialized = False

config_path = tmp_path / "config.yaml"
config_path.write_text(
"\n".join(
[
"llm:",
' api_key: "test-key"',
' base_url: "https://api.openai.com/v1"',
"langfuse:",
" enabled: true",
' public_key: "pk-lf-test"',
' secret_key: "sk-lf-test"',
' host: "http://localhost:3000"',
]
),
encoding="utf-8",
)

config = GlobalConfig.from_yaml(str(config_path))

assert config.langfuse_enabled is True
assert config.langfuse.public_key == "pk-lf-test"
assert config.langfuse.secret_key == "sk-lf-test"
assert config.langfuse.host == "http://localhost:3000"


class TestMultipleAgentConfigurationConsistency:
"""Tests for configuration consistency across multiple agents."""
