# Agent Stack Server SDK

Python SDK for deploying agents to Agent Stack infrastructure.

[![PyPI version](https://img.shields.io/pypi/v/agentstack-sdk.svg?style=plastic)](https://pypi.org/project/agentstack-sdk/)
[![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg?style=plastic)](https://opensource.org/licenses/Apache-2.0)
[![LF AI & Data](https://img.shields.io/badge/LF%20AI%20%26%20Data-0072C6?style=plastic&logo=linuxfoundation&logoColor=white)](https://lfaidata.foundation/projects/)

## Overview

The `agentstack-sdk` provides Python utilities for wrapping agents built with any framework (LangChain, CrewAI, BeeAI Framework, etc.) for deployment on Agent Stack. It handles the A2A (Agent-to-Agent) protocol implementation, platform service integration, and runtime requirements so you can focus on agent logic.

## Key Features

- **Framework-Agnostic Deployment** - Wrap agents from any framework for Agent Stack deployment
- **A2A Protocol Support** - Automatic handling of Agent-to-Agent communication
- **Platform Service Integration** - Connect to Agent Stack's managed LLM, embedding, file storage, and vector store services
- **Agent Wrapper** - Opinionated utilities with `yield` semantics and autowired services
- **Context Storage** - Manage data associated with conversation contexts

## Installation

```bash
uv add agentstack-sdk
```

## Quickstart

```python
import os
from collections.abc import AsyncGenerator

from a2a.types import AgentSkill, Message

from agentstack_sdk.a2a.types import AgentMessage
from agentstack_sdk.server import Server
from agentstack_sdk.server.context import RunContext
from agentstack_sdk.server.store.platform_context_store import PlatformContextStore

# Initialize server
server = Server()

# Define your agent
@server.agent(
    name="My Agent",
    skills=[
        AgentSkill(
            id="my-agent-skill",
            name="My Agent",
            description="Agent description here",
            tags=["Chat"],
            examples=["Example query 1", "Example query 2"],
        )
    ],
)
async def my_agent(
    input: Message,
    context: RunContext,
) -> AsyncGenerator[str, None]:
    """Your agent logic here"""

    # Store incoming message
    await context.store(input)

    # Extract user message
    user_msg = "".join(
        part.root.text for part in input.parts
        if part.root.kind == "text"
    )

    # Process and yield response
    response_text = f"You said: {user_msg}"
    yield response_text

    # Store response in context
    await context.store(AgentMessage(text=response_text))


# Run the server
if __name__ == "__main__":
    server.run(
        host=os.getenv("HOST", "127.0.0.1"),
        port=int(os.getenv("PORT", "8000")),
        context_store=PlatformContextStore(),
    )
```

Run the agent:

```bash
python my_agent.py
```
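
Once the agent is running, you can sanity-check it by fetching its A2A agent card over HTTP. The snippet below is a minimal sketch, assuming the quickstart defaults (`127.0.0.1:8000`) and the standard A2A well-known card path; older A2A protocol versions serve the card at `/.well-known/agent.json` instead.

```python
# Minimal sketch: verify the running agent by fetching its A2A agent card.
# Assumes the quickstart defaults (127.0.0.1:8000) and the standard A2A
# well-known path; older A2A versions use /.well-known/agent.json instead.
import httpx

card = httpx.get("http://127.0.0.1:8000/.well-known/agent-card.json").json()
print(card["name"])                                 # "My Agent"
print([skill["name"] for skill in card["skills"]])  # skills declared in the decorator
```

A successful response confirms the server is reachable and advertising the skills declared in the `@server.agent(...)` decorator.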

## Available Extensions

The SDK includes extension support for:

- **Citations** - Source attribution (`CitationExtensionServer`, `CitationExtensionSpec`)
- **Trajectory** - Agent decision logging (`TrajectoryExtensionServer`, `TrajectoryExtensionSpec`)
- **Settings** - User-configurable agent parameters (`SettingsExtensionServer`, `SettingsExtensionSpec`)
- **LLM Services** - Platform-managed language models (`LLMServiceExtensionServer`, `LLMServiceExtensionSpec`)
- **Agent Details** - Metadata and UI enhancements (`AgentDetail`)
- **And more** - See [Documentation](https://agentstack.beeai.dev/stable/agent-development/overview).

Each extension provides both server-side handlers and A2A protocol specifications for seamless integration with Agent Stack's UI and infrastructure.
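
Extensions are requested declaratively: you add an annotated parameter to the agent function and the SDK autowires the corresponding handler at runtime. The sketch below illustrates the idea for the trajectory extension; the import path and injection pattern shown here are assumptions based on the package layout, so check the documentation for the exact API.

```python
# Hypothetical sketch of extension injection; the import path and the
# Annotated-parameter pattern are assumptions, not a verified API.
from typing import Annotated

from a2a.types import Message
from agentstack_sdk.a2a.extensions import (  # assumed module path
    TrajectoryExtensionServer,
    TrajectoryExtensionSpec,
)
from agentstack_sdk.server import Server
from agentstack_sdk.server.context import RunContext

server = Server()


@server.agent()
async def traced_agent(
    input: Message,
    context: RunContext,
    # The SDK is expected to autowire this extension based on the annotation.
    trajectory: Annotated[TrajectoryExtensionServer, TrajectoryExtensionSpec()],
):
    # Report intermediate steps through the injected extension handler
    # (the exact methods are documented with the trajectory extension).
    yield "Final answer"
```

The same pattern applies to the other extensions listed above, each pairing a `*ExtensionServer` handler with a `*ExtensionSpec` declaration.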

## Resources

- [Agent Stack Documentation](https://agentstack.beeai.dev)
- [GitHub Repository](https://github.com/i-am-bee/agentstack)
- [PyPI Package](https://pypi.org/project/agentstack-sdk/)

## Contributing

Contributions are welcome! Please see the [Contributing Guide](https://github.com/i-am-bee/agentstack/blob/main/CONTRIBUTING.md) for details.

## Support

- [GitHub Issues](https://github.com/i-am-bee/agentstack/issues)
- [GitHub Discussions](https://github.com/i-am-bee/agentstack/discussions)

---

Developed by contributors to the BeeAI project, this initiative is part of the [Linux Foundation AI & Data program](https://lfaidata.foundation/projects/). Its development follows open, collaborative, and community-driven practices.