117 changes: 83 additions & 34 deletions apps/agentstack-sdk-py/README.md
# Agent Stack Server SDK

Python SDK for packaging agents for deployment to Agent Stack infrastructure.

[![PyPI version](https://img.shields.io/pypi/v/agentstack-sdk.svg?style=plastic)](https://pypi.org/project/agentstack-sdk/)
[![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg?style=plastic)](https://opensource.org/licenses/Apache-2.0)
[![LF AI & Data](https://img.shields.io/badge/LF%20AI%20%26%20Data-0072C6?style=plastic&logo=linuxfoundation&logoColor=white)](https://lfaidata.foundation/projects/)

## Overview

The `agentstack-sdk` provides Python utilities for wrapping agents built with any framework (LangChain, CrewAI, BeeAI Framework, etc.) for deployment on Agent Stack. It handles the A2A (Agent-to-Agent) protocol implementation, platform service integration, and runtime requirements so you can focus on agent logic.

## Key Features

- **Framework-Agnostic Deployment** - Wrap agents from any framework for Agent Stack deployment
- **A2A Protocol Support** - Automatic handling of Agent-to-Agent communication
- **Platform Service Integration** - Connect to Agent Stack's managed LLM, embedding, file storage, and vector store services
- **Context Storage** - Manage data associated with conversation contexts

## Installation

```bash
uv add agentstack-sdk
```

## Quickstart

```python
import os

from a2a.types import (
    Message,
)
from a2a.utils.message import get_message_text
from agentstack_sdk.server import Server
from agentstack_sdk.server.context import RunContext
from agentstack_sdk.a2a.types import AgentMessage

server = Server()

@server.agent()
async def example_agent(input: Message, context: RunContext):
    """Polite agent that greets the user"""
    hello_template: str = os.getenv("HELLO_TEMPLATE", "Ciao %s!")
    yield AgentMessage(text=hello_template % get_message_text(input))

def run():
    try:
        server.run(host=os.getenv("HOST", "127.0.0.1"), port=int(os.getenv("PORT", 8000)))
    except KeyboardInterrupt:
        pass


if __name__ == "__main__":
    run()
```

Run the agent:

```bash
uv run my_agent.py
```
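
Once the server is running, it speaks the A2A protocol over HTTP, so a raw JSON-RPC `message/send` call can serve as a quick smoke test. The following is a minimal sketch, not the recommended workflow: it assumes the server's default root endpoint and the A2A 0.2+ message shape, and uses `httpx`, which is not an SDK dependency. In practice you would connect with an A2A client or through the Agent Stack UI.

```python
# Rough smoke test against the Quickstart agent above. The endpoint path and the
# exact response shape depend on the A2A version bundled with the SDK (assumptions).
import uuid

import httpx  # installed separately; not part of agentstack-sdk

payload = {
    "jsonrpc": "2.0",
    "id": "1",
    "method": "message/send",
    "params": {
        "message": {
            "kind": "message",
            "role": "user",
            "messageId": str(uuid.uuid4()),
            "parts": [{"kind": "text", "text": "World"}],
        }
    },
}

# Assumes the agent from the Quickstart is listening on the default host/port.
response = httpx.post("http://127.0.0.1:8000/", json=payload, timeout=30)
print(response.json())  # expect a greeting such as "Ciao World!" in the result
```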

## Available Extensions

The SDK includes extension support for:

- **Citations** - Source attribution (`CitationExtensionServer`, `CitationExtensionSpec`)
- **Trajectory** - Agent decision logging (`TrajectoryExtensionServer`, `TrajectoryExtensionSpec`)
- **Settings** - User-configurable agent parameters (`SettingsExtensionServer`, `SettingsExtensionSpec`)
- **LLM Services** - Platform-managed language models (`LLMServiceExtensionServer`, `LLMServiceExtensionSpec`)
- **Agent Details** - Metadata and UI enhancements (`AgentDetail`)
- **And more** - See [Documentation](https://agentstack.beeai.dev/stable/agent-development/overview)

Each extension provides both server-side handlers and A2A protocol specifications for seamless integration with Agent Stack's UI and infrastructure.
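
As an illustration, here is a hypothetical sketch of an agent that requests a platform-managed model through the LLM Services extension. The `typing.Annotated` injection pattern, the `agentstack_sdk.a2a.extensions` import path, `LLMServiceExtensionSpec.single_demand()`, and the `llm_fulfillments` attribute names are assumptions based on the extension names listed above, not confirmed API; consult the documentation for the exact interface.

```python
# Hypothetical sketch only: import paths, the Annotated-based injection pattern,
# and the fulfillment attribute names below are assumptions, not confirmed API.
from typing import Annotated

from a2a.types import Message
from a2a.utils.message import get_message_text
from openai import AsyncOpenAI  # any OpenAI-compatible client; installed separately

from agentstack_sdk.a2a.extensions import (  # assumed import path
    LLMServiceExtensionServer,
    LLMServiceExtensionSpec,
)
from agentstack_sdk.a2a.types import AgentMessage
from agentstack_sdk.server import Server
from agentstack_sdk.server.context import RunContext

server = Server()

@server.agent()
async def summarizer(
    input: Message,
    context: RunContext,
    # Declare a demand for one platform-managed LLM (assumed spec constructor).
    llm: Annotated[LLMServiceExtensionServer, LLMServiceExtensionSpec.single_demand()],
):
    """Summarizes the incoming message with a platform-managed model."""
    fulfillment = llm.data.llm_fulfillments["default"]  # assumed attribute names
    client = AsyncOpenAI(base_url=fulfillment.api_base, api_key=fulfillment.api_key)
    completion = await client.chat.completions.create(
        model=fulfillment.api_model,
        messages=[{"role": "user", "content": f"Summarize: {get_message_text(input)}"}],
    )
    yield AgentMessage(text=completion.choices[0].message.content)
```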

## Resources

- [Agent Stack Documentation](https://agentstack.beeai.dev)
- [GitHub Repository](https://github.com/i-am-bee/agentstack)
- [PyPI Package](https://pypi.org/project/agentstack-sdk/)

## Contributing

Contributions are welcome! Please see the [Contributing Guide](https://github.com/i-am-bee/agentstack/blob/main/CONTRIBUTING.md) for details.

## Support

- [GitHub Issues](https://github.com/i-am-bee/agentstack/issues)
- [GitHub Discussions](https://github.com/i-am-bee/agentstack/discussions)

---

Developed by contributors to the BeeAI project, this initiative is part of the [Linux Foundation AI & Data program](https://lfaidata.foundation/projects/). Its development follows open, collaborative, and community-driven practices.