|
1 | | -# Agent Stack SDK |
| 1 | +# Agent Stack Server SDK |
2 | 2 |
|
3 | | -## Examples |
| 3 | +Python SDK for packaging agents for deployment to Agent Stack infrastructure. |
4 | 4 |
|
5 | | -The examples connect to the Agent Stack for LLM inteference. |
| 5 | +[PyPI](https://pypi.org/project/agentstack-sdk/)
| 6 | +[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)
| 7 | +[LF AI & Data](https://lfaidata.foundation/projects/)
6 | 8 |
|
7 | | -Run using: |
| 9 | +## Overview |
| 10 | + |
| 11 | +The `agentstack-sdk` package provides Python utilities that wrap agents built with any framework (LangChain, CrewAI, BeeAI Framework, etc.) for deployment on Agent Stack. It handles A2A (Agent-to-Agent) protocol implementation, platform service integration, and runtime requirements so you can focus on agent logic.
| 12 | + |
| 13 | +## Key Features |
| 14 | + |
| 15 | +- **Framework-Agnostic Deployment** - Wrap agents from any framework for Agent Stack deployment |
| 16 | +- **A2A Protocol Support** - Automatic handling of Agent-to-Agent communication |
| 17 | +- **Platform Service Integration** - Connect to Agent Stack's managed LLM, embedding, file storage, and vector store services |
| 18 | +- **Context Storage** - Manage data associated with conversation contexts |
| 19 | + |
| 20 | +## Installation |
8 | 21 |
|
9 | 22 | ```bash |
10 | | -uv run examples/agent.py |
| 23 | +uv add agentstack-sdk |
| 24 | +``` |
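| | +
| | +Or, if you prefer pip:
| | +
| | +```bash
| | +pip install agentstack-sdk
| | +```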
| 25 | + |
| 26 | +## Quickstart |
| 27 | + |
| 28 | +```python |
| 29 | +import os |
| 30 | + |
| 31 | +from a2a.types import Message
| 34 | +from a2a.utils.message import get_message_text |
| 35 | +from agentstack_sdk.server import Server |
| 36 | +from agentstack_sdk.server.context import RunContext |
| 37 | +from agentstack_sdk.a2a.types import AgentMessage |
| 38 | + |
| 39 | +server = Server() |
| 40 | + |
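| | +# Register the function as an A2A agent; each `yield` streams one item back to the client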
| 41 | +@server.agent() |
| 42 | +async def example_agent(input: Message, context: RunContext): |
| 43 | + """Polite agent that greets the user""" |
| 44 | + hello_template: str = os.getenv("HELLO_TEMPLATE", "Ciao %s!") |
| 45 | + yield AgentMessage(text=hello_template % get_message_text(input)) |
| 46 | + |
| 47 | +def run(): |
| 48 | + try: |
| 49 | +        server.run(host=os.getenv("HOST", "127.0.0.1"), port=int(os.getenv("PORT", "8000")))
| 50 | + except KeyboardInterrupt: |
| 51 | + pass |
| 52 | + |
| 53 | + |
| 54 | +if __name__ == "__main__": |
| 55 | + run() |
11 | 56 | ``` |
12 | 57 |
|
13 | | -Connect to the agent using the CLI: |
| 58 | +Save the example above as `my_agent.py` and run it:
14 | 59 |
|
15 | 60 | ```bash |
16 | | -uv run examples/cli.py |
| 61 | +uv run my_agent.py |
17 | 62 | ``` |
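| | +
| | +Once the server is listening, any A2A-compatible client can talk to it. As a quick smoke test, the `httpx` snippet below posts a raw `message/send` request over JSON-RPC; it assumes the server exposes the standard A2A endpoint at its root, and the exact response shape may vary, so treat it as a sketch:
| | +
| | +```python
| | +import httpx
| | +
| | +# Standard A2A JSON-RPC request; messageId can be any unique string.
| | +response = httpx.post(
| | +    "http://127.0.0.1:8000/",
| | +    json={
| | +        "jsonrpc": "2.0",
| | +        "id": "1",
| | +        "method": "message/send",
| | +        "params": {
| | +            "message": {
| | +                "role": "user",
| | +                "messageId": "msg-1",
| | +                "parts": [{"kind": "text", "text": "World"}],
| | +            }
| | +        },
| | +    },
| | +)
| | +print(response.json())  # expect a reply containing "Ciao World!"
| | +```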
18 | 63 |
|
19 | | -## Plan |
20 | | - |
21 | | -- `agentstack_sdk` |
22 | | - - `a2a`: |
23 | | - - `extensions`: Shared definitions for A2A extensions |
24 | | - - `services`: Dependency injection extensions for external services |
25 | | - - `llm` |
26 | | - - `embedding` |
27 | | - - `docling` |
28 | | - - `file_store` |
29 | | - - `vector_store` |
30 | | - - `ui`: User interface extensions for Agent Stack UI |
31 | | - - `trajectory` |
32 | | - - `citations` |
33 | | - - `history`: store and allow requesting the full history of the context |
34 | | - - `server` |
35 | | - - `context_storage`: store data associated with context_id |
36 | | - - `wrapper`: conveniently build A2A agents -- opinionated on how tasks work, `yield`-semantics, autowired |
37 | | - services |
38 | | - - `services`: clients for external services |
39 | | - - `llm`: OpenAI-compatible chat LLM |
40 | | - - `embedding`: OpenAI-compatible embedding |
41 | | - - `text_extraction`: Docling-compatible text extraction |
42 | | - - `file_store`: S3-compatible file storage |
43 | | - - `vector_store`: some vector store? |
44 | | - - `client` |
45 | | - - ? |
| 64 | +## Available Extensions |
| 65 | + |
| 66 | +The SDK includes extension support for: |
| 67 | + |
| 68 | +- **Citations** - Source attribution (`CitationExtensionServer`, `CitationExtensionSpec`) |
| 69 | +- **Trajectory** - Agent decision logging (`TrajectoryExtensionServer`, `TrajectoryExtensionSpec`) |
| 70 | +- **Settings** - User-configurable agent parameters (`SettingsExtensionServer`, `SettingsExtensionSpec`) |
| 71 | +- **LLM Services** - Platform-managed language models (`LLMServiceExtensionServer`, `LLMServiceExtensionSpec`) |
| 72 | +- **Agent Details** - Metadata and UI enhancements (`AgentDetail`) |
| 73 | +- **And more** - See [Documentation](https://agentstack.beeai.dev/stable/agent-development/overview) |
| 74 | + |
| 75 | +Each extension pairs a server-side handler with an A2A protocol specification, so agents integrate with Agent Stack's UI and infrastructure without hand-written protocol code.
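| | +
| | +As an illustration, here is a minimal sketch using the trajectory extension. It assumes extensions are injected through `Annotated` parameters and that `TrajectoryExtensionServer` exposes a `trajectory_metadata()` helper; the exact API may differ, so check the documentation for current signatures:
| | +
| | +```python
| | +from typing import Annotated
| | +
| | +from a2a.types import Message
| | +from a2a.utils.message import get_message_text
| | +from agentstack_sdk.a2a.extensions import (
| | +    TrajectoryExtensionServer,
| | +    TrajectoryExtensionSpec,
| | +)
| | +from agentstack_sdk.a2a.types import AgentMessage
| | +from agentstack_sdk.server import Server
| | +
| | +server = Server()
| | +
| | +@server.agent()
| | +async def traced_agent(
| | +    input: Message,
| | +    # Assumption: the spec in Annotated tells the server to wire up the extension
| | +    trajectory: Annotated[TrajectoryExtensionServer, TrajectoryExtensionSpec()],
| | +):
| | +    """Agent that logs one trajectory step before replying"""
| | +    # Assumption: trajectory_metadata() yields a log entry shown in the Agent Stack UI
| | +    yield trajectory.trajectory_metadata(
| | +        title="Planning",
| | +        content=f"Preparing a reply to: {get_message_text(input)}",
| | +    )
| | +    yield AgentMessage(text="Done!")
| | +```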
| 76 | + |
| 77 | +## Resources |
| 78 | + |
| 79 | +- [Agent Stack Documentation](https://agentstack.beeai.dev) |
| 80 | +- [GitHub Repository](https://github.com/i-am-bee/agentstack) |
| 81 | +- [PyPI Package](https://pypi.org/project/agentstack-sdk/) |
| 82 | + |
| 83 | +## Contributing |
| 84 | + |
| 85 | +Contributions are welcome! Please see the [Contributing Guide](https://github.com/i-am-bee/agentstack/blob/main/CONTRIBUTING.md) for details. |
| 86 | + |
| 87 | +## Support |
| 88 | + |
| 89 | +- [GitHub Issues](https://github.com/i-am-bee/agentstack/issues) |
| 90 | +- [GitHub Discussions](https://github.com/i-am-bee/agentstack/discussions) |
| 91 | + |
| 92 | +--- |
| 93 | + |
| 94 | +Developed by contributors to the BeeAI project, this initiative is part of the [Linux Foundation AI & Data program](https://lfaidata.foundation/projects/). Its development follows open, collaborative, and community-driven practices. |