Merged
Changes from 10 commits
28 changes: 28 additions & 0 deletions README.md
@@ -86,6 +86,8 @@ cp examples/sgr_deep_research/config.yaml.example examples/sgr_deep_research/con

```bash
sgr --config-file examples/sgr_deep_research/config.yaml
# or use the short option
sgr -c examples/sgr_deep_research/config.yaml
```

> **Note:** You can also run the server directly with Python:
@@ -94,6 +96,32 @@ sgr --config-file examples/sgr_deep_research/config.yaml
> python -m sgr_agent_core.server --config-file examples/sgr_deep_research/config.yaml
> ```

### Using the CLI Tool (`sgrsh`)

For interactive command-line usage, you can use the `sgrsh` utility:

```bash
# Single query mode
sgrsh "Find the Bitcoin price"

# With agent selection (e.g. sgr_agent, dialog_agent)
sgrsh --agent sgr_agent "What is AI?"

# With custom config file
sgrsh -c config.yaml -a sgr_agent "Your query"

# Interactive chat mode (no query argument)
sgrsh
sgrsh -a sgr_agent
```

The `sgrsh` command:

- Automatically looks for `config.yaml` in the current directory
- Supports interactive chat mode for multiple queries
- Handles clarification requests and dialog responses (intermediate results) from agents
- Works with any agent defined in your configuration (e.g. `sgr_agent`, `dialog_agent`)

For more examples and detailed usage instructions, see the [examples/](examples/) directory.

## Benchmarking
57 changes: 54 additions & 3 deletions docs/en/getting-started/index.md
@@ -57,10 +57,61 @@ pip install sgr-agent-core

See the [Installation Guide](installation.md) for detailed instructions and the [Using as Library](../framework/first-steps.md) guide to get started.

### Next Steps
### CLI Tool (`sgrsh`)

- **[Using as Library](../framework/first-steps.md)** — Learn how to use SGR Agent Core as a Python library
- **[API Server Quick Start](../sgr-api/SGR-Quick-Start.md)** — Get started with the REST API service
After installation, you can use the `sgrsh` command-line tool for interactive agent usage:

```bash
# Single query mode
sgrsh "Find the current Bitcoin price"

# With agent selection
sgrsh --agent sgr_agent "What is AI?"

# With custom config file
sgrsh -c config.yaml -a sgr_agent "Your query"

# Interactive chat mode (no query argument)
sgrsh
sgrsh -a sgr_agent
```

The `sgrsh` command:
- Automatically looks for `config.yaml` in the current directory
- Supports interactive chat mode for multiple queries
- Handles clarification requests from agents interactively
- Works with any agent defined in your configuration

### Using as Library

```python
import asyncio
from sgr_agent_core import AgentDefinition, AgentFactory
from sgr_agent_core.agents import SGRToolCallingAgent
import sgr_agent_core.tools as tools

async def main():
agent_def = AgentDefinition(
name="my_agent",
base_class=SGRToolCallingAgent,
tools=[tools.GeneratePlanTool, tools.FinalAnswerTool],
llm={
"api_key": "your-api-key",
"base_url": "https://api.openai.com/v1",
},
)

agent = await AgentFactory.create(
agent_def=agent_def,
task_messages=[{"role": "user", "content": "Research AI trends"}],
)

result = await agent.execute()
print(result)

if __name__ == "__main__":
asyncio.run(main())
```
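
Agents such as `dialog_agent` pause and wait for user input when they ask a clarifying question. That wait-and-resume flow can be modeled as a minimal standalone sketch built on `asyncio.Event`; the class and method names here are illustrative stand-ins, not the library's internals:

```python
import asyncio


class Session:
    """Toy stand-in for an agent context that pauses until the user replies."""

    def __init__(self) -> None:
        self.clarification_received = asyncio.Event()
        self.reply: str = ""

    async def wait_for_clarification(self) -> str:
        # Reset the event, then block until a reply arrives.
        self.clarification_received.clear()
        await self.clarification_received.wait()
        return self.reply

    def provide_clarification(self, text: str) -> None:
        self.reply = text
        self.clarification_received.set()


async def demo() -> str:
    session = Session()
    waiter = asyncio.create_task(session.wait_for_clarification())
    await asyncio.sleep(0)  # let the waiter start blocking on the event
    session.provide_clarification("use the latest data")
    return await waiter


if __name__ == "__main__":
    print(asyncio.run(demo()))  # use the latest data
```

The agent side only ever awaits the event; the user-facing side (CLI or API) sets it once input arrives, which keeps the agent reusable across turns.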

## Documentation

9 changes: 8 additions & 1 deletion docs/en/getting-started/installation.md
@@ -40,10 +40,17 @@ After installation, verify that the package is correctly installed:
python -c "import sgr_agent_core; print(sgr_agent_core.__version__)"
```

You should also be able to use the `sgr` command-line utility:
You should also be able to use the command-line utilities:

```bash
# API server command
sgr --help
# or with the short option
sgr -c config.yaml

# Interactive CLI command
sgrsh --help
sgrsh "Your query here"
```

## Installation via Docker
1 change: 1 addition & 0 deletions docs/en/sgr-api/SGR-Summary-table.md
@@ -5,3 +5,4 @@ This table compares the available agent types in SGR Agent Core, showing their i
| [SGRAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/sgr_agent.py) | `sgr_agent` | Structured Output | ❌ Built into schema | 6 basic | 1 | SO Union Type |
| [ToolCallingAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/tool_calling_agent.py) | `tool_calling_agent` | ❌ Absent | ❌ Absent | 6 basic | 1 | FC "required" |
| [SGRToolCallingAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/sgr_tool_calling_agent.py) | `sgr_tool_calling_agent` | FC Tool enforced | ✅ First step FC | 7 (6 + ReasoningTool) | 2 | FC → FC TOP AGENT |
| [DialogAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/dialog_agent.py) | `dialog_agent` | Same as SGRToolCallingAgent | ✅ First step FC | 8 (+ AnswerTool) | 2 | FC → FC Long dialogs |
57 changes: 54 additions & 3 deletions docs/ru/getting-started/index.md
@@ -57,10 +57,61 @@ pip install sgr-agent-core

См. [Руководство по установке](installation.md) для подробных инструкций и [Использование как библиотека](../framework/first-steps.md) для начала работы.

### Следующие шаги
### CLI утилита (`sgrsh`)

- **[Использование как библиотека](../framework/first-steps.md)** — Узнайте, как использовать SGR Agent Core как Python библиотеку
- **[Быстрый старт API сервера](../sgr-api/SGR-Quick-Start.md)** — Начните работу с REST API сервисом
После установки вы можете использовать утилиту командной строки `sgrsh` для интерактивной работы с агентами:

```bash
# Режим одного запроса
sgrsh "Найди текущую цену биткоина"

# С выбором агента
sgrsh --agent sgr_agent "Что такое AI?"

# С указанием файла конфигурации
sgrsh -c config.yaml -a sgr_agent "Ваш запрос"

# Интерактивный режим чата (без аргумента запроса)
sgrsh
sgrsh -a sgr_agent
```

Команда `sgrsh`:
- Автоматически ищет `config.yaml` в текущей директории
- Поддерживает интерактивный режим чата для множественных запросов
- Обрабатывает запросы на уточнение от агентов интерактивно
- Работает с любым агентом, определённым в вашей конфигурации

### Использование как библиотека

```python
import asyncio
from sgr_agent_core import AgentDefinition, AgentFactory
from sgr_agent_core.agents import SGRToolCallingAgent
import sgr_agent_core.tools as tools

async def main():
agent_def = AgentDefinition(
name="my_agent",
base_class=SGRToolCallingAgent,
tools=[tools.GeneratePlanTool, tools.FinalAnswerTool],
llm={
"api_key": "your-api-key",
"base_url": "https://api.openai.com/v1",
},
)

agent = await AgentFactory.create(
agent_def=agent_def,
task_messages=[{"role": "user", "content": "Исследуй тренды в AI"}],
)

result = await agent.execute()
print(result)

if __name__ == "__main__":
asyncio.run(main())
```

## Документация

9 changes: 8 additions & 1 deletion docs/ru/getting-started/installation.md
@@ -40,10 +40,17 @@ pip install sgr-agent-core[docs]
python -c "import sgr_agent_core; print(sgr_agent_core.__version__)"
```

Также вы должны иметь возможность использовать утилиту командной строки `sgr`:
Также вы должны иметь возможность использовать утилиты командной строки:

```bash
# Команда API сервера
sgr --help
# или с коротким параметром
sgr -c config.yaml

# Интерактивная CLI команда
sgrsh --help
sgrsh "Ваш запрос здесь"
```

## Установка через Docker
1 change: 1 addition & 0 deletions docs/ru/sgr-api/SGR-Summary-table.md
@@ -5,3 +5,4 @@
| [SGRAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/sgr_agent.py) | `sgr_agent` | Structured Output | ❌ Встроен в схему | 6 базовых | 1 | SO Union Type |
| [ToolCallingAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/tool_calling_agent.py) | `tool_calling_agent` | ❌ Отсутствует | ❌ Отсутствует | 6 базовых | 1 | FC "required" |
| [SGRToolCallingAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/sgr_tool_calling_agent.py) | `sgr_tool_calling_agent` | FC Tool принудительно | ✅ Первый шаг FC | 7 (6 + ReasoningTool) | 2 | FC → FC ЛУЧШИЙ АГЕНТ |
| [DialogAgent](https://github.com/vamplabai/sgr-agent-core/blob/main/sgr_agent_core/agents/dialog_agent.py) | `dialog_agent` | Как SGRToolCallingAgent | ✅ Первый шаг FC | 8 (+ AnswerTool) | 2 | FC → FC Длинные диалоги |
1 change: 1 addition & 0 deletions examples/sgr_deep_research/README.md
@@ -9,6 +9,7 @@ SGR Deep Research contains research agent definitions and configuration files fo
- **SGR Agent** - Schema-Guided Reasoning agent for structured research
- **Tool Calling Agent** - Function calling agent for research tasks
- **SGR Tool Calling Agent** - Hybrid SGR + function calling agent
- **Dialog Agent** - Dialog agent with intermediate results and long conversations

All agents include:

24 changes: 20 additions & 4 deletions examples/sgr_deep_research/config.yaml.example
@@ -5,7 +5,7 @@
llm:
api_key: "your-openai-api-key-here" # Your OpenAI API key
base_url: "https://api.openai.com/v1" # API base URL
model: "gpt-4o-mini" # Model name
model: "gpt-4.1-mini" # Model name
max_tokens: 8000 # Max output tokens
temperature: 0.4 # Temperature (0.0-1.0)
# proxy: "socks5://127.0.0.1:1081" # Optional proxy (socks5:// or http://)
@@ -51,14 +51,16 @@ tools:
# base_class defaults to sgr_agent_core.tools.AdaptPlanTool
reasoning_tool:
# base_class defaults to sgr_agent_core.tools.ReasoningTool
answer_tool:
# base_class defaults to sgr_agent_core.tools.AnswerTool

# Agent Definitions
agents:
# SGR Agent for research
sgr_agent:
base_class: "agents.ResearchSGRAgent"
llm:
model: "gpt-4o-mini"
model: "gpt-4.1-mini"
temperature: 0.4
tools:
- "web_search_tool"
@@ -73,7 +75,7 @@ agents:
tool_calling_agent:
base_class: "agents.ResearchToolCallingAgent"
llm:
model: "gpt-4o-mini"
model: "gpt-4.1-mini"
temperature: 0.4
tools:
- "web_search_tool"
@@ -88,7 +90,7 @@
sgr_tool_calling_agent:
base_class: "agents.ResearchSGRToolCallingAgent"
llm:
model: "gpt-4o-mini"
model: "gpt-4.1-mini"
temperature: 0.4
tools:
- "web_search_tool"
@@ -99,3 +101,17 @@
- "reasoning_tool"
- "generate_plan_tool"
- "adapt_plan_tool"

# Dialog Agent for research (intermediate results, long conversations)
dialog_agent:
base_class: "agents.ResearchDialogAgent"
llm:
model: "gpt-4.1-mini"
temperature: 0.4
tools:
- "web_search_tool"
- "extract_page_content_tool"
- "reasoning_tool"
- "answer_tool"
- "generate_plan_tool"
- "adapt_plan_tool"
7 changes: 7 additions & 0 deletions examples/sgr_deep_research/definitions.py
@@ -8,6 +8,7 @@

import sgr_agent_core.tools as tools
from examples.sgr_deep_research.agents import (
ResearchDialogAgent,
ResearchSGRAgent,
ResearchSGRToolCallingAgent,
ResearchToolCallingAgent,
@@ -50,5 +51,11 @@ def get_research_agents_definitions() -> dict[str, AgentDefinition]:
tools=DEFAULT_TOOLKIT,
prompts=PromptsConfig(system_prompt_file=Path("sgr_agent_core/prompts/research_system_prompt.txt")),
),
AgentDefinition(
name="research_dialog_agent",
base_class=ResearchDialogAgent,
tools=DEFAULT_TOOLKIT,
prompts=PromptsConfig(system_prompt_file=Path("sgr_agent_core/prompts/research_system_prompt.txt")),
),
]
return {agent.name: agent for agent in agents}
1 change: 1 addition & 0 deletions pyproject.toml
@@ -63,6 +63,7 @@ Documentation = "https://vamplabai.github.io/sgr-agent-core/"

[project.scripts]
sgr = "sgr_agent_core.server.__main__:main"
sgrsh = "sgr_agent_core.cli.__main__:main"

[project.optional-dependencies]
dev = [
2 changes: 2 additions & 0 deletions sgr_agent_core/agents/__init__.py
@@ -1,10 +1,12 @@
"""Agents module for SGR Agent Core."""

from sgr_agent_core.agents.dialog_agent import DialogAgent
from sgr_agent_core.agents.sgr_agent import SGRAgent
from sgr_agent_core.agents.sgr_tool_calling_agent import SGRToolCallingAgent
from sgr_agent_core.agents.tool_calling_agent import ToolCallingAgent

__all__ = [
"DialogAgent",
"SGRAgent",
"SGRToolCallingAgent",
"ToolCallingAgent",
52 changes: 52 additions & 0 deletions sgr_agent_core/agents/dialog_agent.py
@@ -0,0 +1,52 @@
"""Dialog agent for long-running conversations with intermediate results."""

from typing import Type

from openai import AsyncOpenAI

from sgr_agent_core.agent_definition import AgentConfig
from sgr_agent_core.agents.sgr_tool_calling_agent import SGRToolCallingAgent
from sgr_agent_core.models import AgentStatesEnum
from sgr_agent_core.tools import AnswerTool, BaseTool


class DialogAgent(SGRToolCallingAgent):
"""Agent specialized for dialog interactions with intermediate results.

Uses AnswerTool to share intermediate results and maintain
conversation flow, keeping the agent available for further
interactions. Supports long dialogs with full conversation history.
"""

name: str = "dialog_agent"

def __init__(
self,
task_messages: list,
openai_client: AsyncOpenAI,
agent_config: AgentConfig,
toolkit: list[Type[BaseTool]],
def_name: str | None = None,
**kwargs: dict,
):
# Ensure AnswerTool is in toolkit for dialog flow; keep tools from config/registry
answer_toolkit = [AnswerTool]
merged_toolkit = answer_toolkit + [t for t in toolkit if t is not AnswerTool]
super().__init__(
task_messages=task_messages,
openai_client=openai_client,
agent_config=agent_config,
toolkit=merged_toolkit,
def_name=def_name,
**kwargs,
)

async def _after_action_phase(self, action_tool: BaseTool, result: str) -> None:
"""Wait for user response when AnswerTool was used."""
await super()._after_action_phase(action_tool, result)
if isinstance(action_tool, AnswerTool):
self.logger.info("\n💬 Dialog shared - agent waiting for response")
self._context.state = AgentStatesEnum.WAITING_FOR_CLARIFICATION
self.streaming_generator.finish(result)
self._context.clarification_received.clear()
await self._context.clarification_received.wait()
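
The toolkit handling in `__init__` above reduces to a small standalone sketch. The stub classes below are placeholders, not the library's real tools: `AnswerTool` is forced to the front of the toolkit, and any duplicate entry from the configured toolkit is dropped, so the dialog-flow tool is always present exactly once.

```python
class BaseTool:
    """Placeholder for sgr_agent_core.tools.BaseTool."""


class AnswerTool(BaseTool):
    """Placeholder for the tool that shares intermediate answers."""


class WebSearchTool(BaseTool):
    """Placeholder for an arbitrary configured tool."""


def merge_toolkit(toolkit: list[type[BaseTool]]) -> list[type[BaseTool]]:
    # AnswerTool first, then everything else except a duplicate AnswerTool.
    return [AnswerTool] + [t for t in toolkit if t is not AnswerTool]
```

For example, `merge_toolkit([WebSearchTool, AnswerTool])` yields `[AnswerTool, WebSearchTool]`: the configured order is preserved for the remaining tools while the dialog tool is deduplicated and promoted.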