Merged
54 changes: 40 additions & 14 deletions README.md
@@ -6,8 +6,6 @@ Open-source agentic framework for building intelligent research agents using Sch

The library includes extensible tools for search, reasoning, and clarification; real-time streaming responses; and an OpenAI-compatible REST API. It works with any OpenAI-compatible LLM, including local models for fully private research.

______________________________________________________________________

## Why use SGR Agent Core?

- **Schema-Guided Reasoning** — SGR combines structured reasoning with flexible tool selection
@@ -17,8 +15,6 @@ ______________________________________________________________________
- **Real-time Streaming** — Built-in support for streaming responses via SSE
- **Production Ready** — Battle-tested with comprehensive test coverage and Docker support

______________________________________________________________________

## Documentation

> **Get started quickly with our documentation:**
@@ -27,25 +23,61 @@ ______________________________________________________________________
- **[Framework Quick Start Guide](https://vamplabai.github.io/sgr-agent-core/framework/first-steps/)** - Get up and running in minutes
- **[DeepSearch Service Documentation](https://vamplabai.github.io/sgr-agent-core/sgr-api/SGR-Quick-Start/)** - REST API reference with examples

______________________________________________________________________

## Quick Start

### Running with Docker

The fastest way to get started is using Docker:

```bash
# Clone the repository
git clone https://github.com/vamplabai/sgr-agent-core.git
cd sgr-agent-core

# Create directories with write permissions for all
sudo mkdir -p logs reports
sudo chmod 777 logs reports

# Copy and edit the configuration file
cp examples/sgr_deep_research/config.yaml.example examples/sgr_deep_research/config.yaml
# Edit examples/sgr_deep_research/config.yaml and set your API keys:
# - llm.api_key: Your OpenAI API key
# - search.tavily_api_key: Your Tavily API key (optional)

# Run the container
docker run --rm -i \
--name sgr-agent \
-p 8010:8010 \
-v $(pwd)/examples/sgr_deep_research:/app/examples/sgr_deep_research:ro \
-v $(pwd)/logs:/app/logs \
-v $(pwd)/reports:/app/reports \
ghcr.io/vamplabai/sgr-agent-core:latest \
--config-file /app/examples/sgr_deep_research/config.yaml \
--host 0.0.0.0 \
--port 8010
```

The API server will be available at `http://localhost:8010` with OpenAI-compatible API endpoints. Interactive API documentation (Swagger UI) is available at `http://localhost:8010/docs`.
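Because the endpoints are OpenAI-compatible, any OpenAI-style client can talk to the server. Below is a minimal sketch of a chat-completions request body; the model name `sgr-agent` and the `/v1/chat/completions` path are assumptions — check the Swagger UI at `/docs` for what your configuration actually exposes:

```python
import json

# OpenAI-compatible chat-completions payload. The model name is a
# placeholder -- see http://localhost:8010/docs for the real one.
payload = {
    "model": "sgr-agent",
    "messages": [{"role": "user", "content": "Research recent AI trends"}],
    "stream": True,  # the server supports streaming responses via SSE
}

body = json.dumps(payload)
print(body)

# POST it with any HTTP client, for example:
#   curl -X POST http://localhost:8010/v1/chat/completions \
#        -H "Content-Type: application/json" \
#        -d "$body"
```

Setting `"stream": False` returns a single JSON response instead of an SSE stream.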

### Installation

If you want to use SGR Agent Core as a Python library (framework):

```bash
pip install sgr-agent-core
```

See the [Installation Guide](https://vamplabai.github.io/sgr-agent-core/getting-started/installation/) for detailed instructions and the [Using as Library](https://vamplabai.github.io/sgr-agent-core/framework/first-steps/) guide to get started.

### Running Research Agents

The project includes example research agent configurations in the `examples/` directory. To get started with deep research agents:

1. Copy and configure the config file:

```bash
cp examples/sgr_deep_research/config.yaml my_config.yaml
# Edit my_config.yaml and set your API keys:
cp examples/sgr_deep_research/config.yaml.example examples/sgr_deep_research/config.yaml
# Edit examples/sgr_deep_research/config.yaml and set your API keys:
# - llm.api_key: Your OpenAI API key
# - search.tavily_api_key: Your Tavily API key (optional)
```
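After copying, the edited `config.yaml` might look like the sketch below. Only the two keys mentioned in the comments above are shown, with placeholder values; the example file may contain additional settings:

```yaml
llm:
  api_key: "sk-..."           # your OpenAI API key
search:
  tavily_api_key: "tvly-..."  # optional, enables Tavily web search
```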
@@ -56,8 +88,6 @@ cp examples/sgr_deep_research/config.yaml my_config.yaml
sgr --config-file examples/sgr_deep_research/config.yaml
```

The server will start on `http://localhost:8010` with OpenAI-compatible API endpoints.

> **Note:** You can also run the server directly with Python:
>
> ```bash
@@ -66,8 +96,6 @@ The server will start on `http://localhost:8010` with OpenAI-compatible API endp

For more examples and detailed usage instructions, see the [examples/](examples/) directory.

______________________________________________________________________

## Benchmarking

![SimpleQA Benchmark Comparison](https://github.com/vamplabAI/sgr-agent-core/blob/main/docs/assets/images/simpleqa_benchmark_comparison.png)
@@ -81,8 +109,6 @@ ______________________________________________________________________

More detailed benchmark results are available [here](https://github.com/vamplabAI/sgr-agent-core/blob/main/benchmark/simpleqa_benchmark_results.md).

______________________________________________________________________

## Open-Source Development Team

*All development is driven by pure enthusiasm and open-source community collaboration. We welcome contributors of all skill levels!*
2 changes: 1 addition & 1 deletion docs/en/framework/first-steps.md
@@ -1,4 +1,4 @@
# Quick Start
# Using as Library

This guide will help you quickly get started with SGR Agent Core as a Python library.

77 changes: 39 additions & 38 deletions docs/en/getting-started/index.md
@@ -15,58 +15,59 @@ The library includes extensible tools for search, reasoning, and clarification,

## Quick Start

### Installation
### Running with Docker

Install SGR Agent Core via pip:
The fastest way to get started is using Docker:

```bash
pip install sgr-agent-core
# Clone the repository
git clone https://github.com/vamplabai/sgr-agent-core.git
cd sgr-agent-core

# Create directories with write permissions for all
sudo mkdir -p logs reports
sudo chmod 777 logs reports

# Copy and edit the configuration file
cp examples/sgr_deep_research/config.yaml.example examples/sgr_deep_research/config.yaml
# Edit examples/sgr_deep_research/config.yaml and set your API keys

# Run the container
docker run --rm -i \
--name sgr-agent \
-p 8010:8010 \
-v $(pwd)/examples/sgr_deep_research:/app/examples/sgr_deep_research:ro \
-v $(pwd)/logs:/app/logs \
-v $(pwd)/reports:/app/reports \
ghcr.io/vamplabai/sgr-agent-core:latest \
--config-file /app/examples/sgr_deep_research/config.yaml \
--host 0.0.0.0 \
--port 8010
```

Or use Docker:
The API server will be available at `http://localhost:8010`. Interactive API documentation (Swagger UI) is available at `http://localhost:8010/docs`.

### Installation

If you want to use SGR Agent Core as a Python library (framework):

```bash
docker pull ghcr.io/vamplabai/sgr-agent-core:latest
pip install sgr-agent-core
```

See the [Installation Guide](installation.md) for detailed instructions.

### Quick Example

```python
import asyncio
from sgr_agent_core import AgentDefinition, AgentFactory
from sgr_agent_core.agents import SGRToolCallingAgent
import sgr_agent_core.tools as tools

async def main():
    agent_def = AgentDefinition(
        name="my_agent",
        base_class=SGRToolCallingAgent,
        tools=[tools.GeneratePlanTool, tools.FinalAnswerTool],
        llm={
            "api_key": "your-api-key",
            "base_url": "https://api.openai.com/v1",
        },
    )

    agent = await AgentFactory.create(
        agent_def=agent_def,
        task_messages=[{"role": "user", "content": "Research AI trends"}],
    )

    result = await agent.execute()
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```
See the [Installation Guide](installation.md) for detailed instructions and the [Using as Library](../framework/first-steps.md) guide to get started.

### Next Steps

- **[Using as Library](../framework/first-steps.md)** — Learn how to use SGR Agent Core as a Python library
- **[API Server Quick Start](../sgr-api/SGR-Quick-Start.md)** — Get started with the REST API service

## Documentation

- **[Installation](installation.md)** — Detailed installation instructions for pip and Docker
- **[Agent Core Framework](../framework/main-concepts.md)** — Understand the core concepts and architecture
- **[SGR API Service](../sgr-api/SGR-Quick-Start.md)** — Get started with the REST API service
- **[Using as Library](../framework/first-steps.md)** — Learn how to use SGR Agent Core as a Python library
- **[API Server Quick Start](../sgr-api/SGR-Quick-Start.md)** — Get started with the REST API service

## Contact & Community

2 changes: 1 addition & 1 deletion docs/en/getting-started/installation.md
@@ -114,7 +114,7 @@ services:
docker-compose up -d
```

The API server will be available at `http://localhost:8010`.
The API server will be available at `http://localhost:8010`. Interactive API documentation (Swagger UI) is available at `http://localhost:8010/docs`.

### Building from Source
