51 changes: 51 additions & 0 deletions .github/copilot-instructions.md
@@ -0,0 +1,51 @@
# Copilot Instructions for any-llm

> Full guidelines are in [AGENTS.md](../AGENTS.md). This file surfaces the rules most critical for AI-assisted coding.

## Commands

```bash
# Setup
uv venv && source .venv/bin/activate && uv sync --all-extras -U

# Run all checks (preferred before committing)
uv run pre-commit run --all-files --verbose

# Tests
uv run pytest -v tests/unit
uv run pytest -v tests/integration -n auto # requires API keys
```

## Code Style (enforced by mypy + ruff)

- **Type hints required** on all new code; mypy runs in strict mode
- **`@override` decorator** from `typing_extensions` is required on every method that overrides a base class method — mypy enforces `explicit-override`. For static methods: `@staticmethod` first, then `@override`
- **Direct attribute access** (`obj.field`) preferred over `getattr(obj, "field")` for typed fields
- Line length: 120 chars (ruff)
- No decorative section-separator comments (`# ------` banners)
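
The `@override` rule above can be sketched as follows; `BaseProvider` and `MyProvider` are made-up names for illustration, not classes from this repo:

```python
# Illustrative sketch of the @override rule enforced by mypy's
# explicit-override check. Class names here are hypothetical.
try:
    from typing_extensions import override
except ImportError:  # typing.override exists on Python >= 3.12
    from typing import override


class BaseProvider:
    def completion(self) -> str:
        return "base"

    @staticmethod
    def name() -> str:
        return "base"


class MyProvider(BaseProvider):
    @override
    def completion(self) -> str:
        return "mine"

    @staticmethod  # @staticmethod first...
    @override      # ...then @override
    def name() -> str:
        return "mine"
```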

## Project Structure

```
src/any_llm/
providers/<provider>/ ← all provider-specific code goes here
types/ ← shared types
gateway/ ← optional FastAPI gateway
tests/
unit/ ← no API keys needed
integration/ ← skip when creds unavailable
gateway/
```

## Testing Rules

- **No class-based test grouping** — all tests are standalone functions
- Add happy path + error/raise path tests for every change (~85% coverage target)
- Integration tests must `pytest.skip(...)` when credentials are unavailable
- Optional-dependency imports (e.g. `mistralai`, `cohere`) go **inside** the test function, not at the top of the file
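
The rules above can be sketched as a pair of standalone test functions; the function names and the tested call are illustrative placeholders, not code from this repo:

```python
# Hypothetical tests following the rules above: no test classes, optional
# deps imported inside the function, skip when credentials are missing.
import os

import pytest


def test_completion_happy_path() -> None:
    cohere = pytest.importorskip("cohere")  # optional dep stays inside the test
    if "COHERE_API_KEY" not in os.environ:
        pytest.skip("COHERE_API_KEY not set")
    # ... exercise the provider with the real client ...


def test_completion_rejects_unknown_model() -> None:
    # error-path counterpart to the happy-path test above
    with pytest.raises(ValueError):
        int("not-a-model")  # placeholder for the real failing call
```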

## Commits & PRs

- Conventional Commits: `feat(scope): ...`, `fix: ...`, `chore(deps): ...`, `tests: ...`
- PRs must complete the checklist in `.github/pull_request_template.md` and include AI-usage disclosure when applicable
- Never commit secrets — use env vars or a gitignored `.env`
79 changes: 79 additions & 0 deletions docs/src/content/docs/providers.md
@@ -27,6 +27,7 @@ Provider source code is in [`src/any_llm/providers/`](https://github.com/mozilla
| [`bedrock`](https://aws.amazon.com/bedrock/) | AWS_BEARER_TOKEN_BEDROCK | AWS_ENDPOINT_URL_BEDROCK_RUNTIME | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| [`cerebras`](https://docs.cerebras.ai/) | CEREBRAS_API_KEY | CEREBRAS_API_BASE | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ |
| [`cohere`](https://cohere.com/api) | COHERE_API_KEY | COHERE_BASE_URL | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ |
| [`copilot_sdk`](https://github.com/github/copilot-sdk) | COPILOT_GITHUB_TOKEN | COPILOT_CLI_URL | ❌ | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ | ❌ |
| [`databricks`](https://docs.databricks.com/) | DATABRICKS_TOKEN | DATABRICKS_HOST | ❌ | ✅ | ✅ | ✅ | ❌ | ✅ | ❌ | ❌ |
| [`deepseek`](https://platform.deepseek.com/) | DEEPSEEK_API_KEY | DEEPSEEK_API_BASE | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ |
| [`fireworks`](https://fireworks.ai/api) | FIREWORKS_API_KEY | FIREWORKS_API_BASE | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ | ❌ |
@@ -60,3 +61,81 @@ Provider source code is in [`src/any_llm/providers/`](https://github.com/mozilla
| [`xai`](https://x.ai/) | XAI_API_KEY | XAI_API_BASE | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ |
| [`zai`](https://docs.z.ai/guides/develop/python/introduction) | ZAI_API_KEY | ZAI_BASE_URL | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ |
<!-- PROVIDER-TABLE-END -->

## Provider Notes

### `copilot_sdk` — GitHub Copilot SDK

The `copilot_sdk` provider communicates with GitHub Copilot models via the
[`github-copilot-sdk`](https://pypi.org/project/github-copilot-sdk/) Python package,
which bundles the Copilot CLI binary for your platform.

#### Installation

Install the `copilot_sdk` extra, which pulls in the platform-specific wheel:

```bash
pip install any-llm-sdk[copilot_sdk]
```

> **Note**: `github-copilot-sdk` ships separate wheels per OS and CPU architecture
> (e.g. `macosx_arm64`, `linux_x86_64`). `pip` selects the correct wheel automatically
> on supported platforms. If installation fails, check [PyPI](https://pypi.org/project/github-copilot-sdk/#files)
> for available platform tags.

#### Authentication

Two modes are supported, checked in order:

1. **Token mode** — set one of these environment variables:

   ```bash
   export COPILOT_GITHUB_TOKEN="ghp_your_token"
   # or
   export GITHUB_TOKEN="ghp_your_token"
   # or
   export GH_TOKEN="ghp_your_token"
   ```

   Alternatively, pass `api_key` directly to `AnyLLM.create()`.

2. **Logged-in CLI user** — if no token is set, the provider uses the credentials
   from the local `gh` / `copilot` CLI session (run `gh auth login` first). No
   environment variable is required in this mode.

#### Configuration

| Environment Variable | Purpose | Default |
| --- | --- | --- |
| `COPILOT_GITHUB_TOKEN` | GitHub token with Copilot access | — |
| `GITHUB_TOKEN` / `GH_TOKEN` | Fallback token sources | — |
| `COPILOT_CLI_URL` | Connect to an external CLI server instead of spawning one (e.g. `localhost:9000`) | — |
| `COPILOT_CLI_PATH` | Override the Copilot CLI binary path | PATH lookup |

#### Usage

```python
from any_llm import AnyLLM

# Token auth (or set COPILOT_GITHUB_TOKEN in environment)
llm = AnyLLM.create("copilot_sdk")

# List available models
models = llm.list_models()

# Completion with reasoning
response = llm.completion(
    model="claude-sonnet-4-5",
    messages=[{"role": "user", "content": "Explain async generators in Python."}],
    reasoning_effort="high",
)
print(response.choices[0].message.content)

# Streaming
for chunk in llm.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
):
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```
6 changes: 5 additions & 1 deletion pyproject.toml
@@ -20,7 +20,7 @@ dependencies = [
[project.optional-dependencies]

all = [
"any-llm-sdk[mistral,anthropic,huggingface,gemini,vertexai,vertexaianthropic,cohere,cerebras,fireworks,groq,bedrock,azure,azureanthropic,azureopenai,watsonx,together,sambanova,ollama,moonshot,nebius,xai,databricks,deepseek,inception,openai,openrouter,portkey,lmstudio,llama,voyage,perplexity,platform,llamafile,llamacpp,sagemaker,gateway,zai,minimax,mzai,vllm]"
"any-llm-sdk[mistral,anthropic,huggingface,gemini,vertexai,vertexaianthropic,cohere,cerebras,fireworks,groq,bedrock,azure,azureanthropic,azureopenai,watsonx,together,sambanova,ollama,moonshot,nebius,xai,databricks,deepseek,inception,openai,openrouter,portkey,lmstudio,llama,voyage,perplexity,platform,llamafile,llamacpp,sagemaker,gateway,zai,minimax,mzai,vllm,copilot_sdk]"
]

platform = [
@@ -31,6 +31,10 @@ platform = [

perplexity = []

copilot_sdk = [
"github-copilot-sdk>=0.1.0",
]

mistral = [
"mistralai>=1.9.3",
]
1 change: 1 addition & 0 deletions src/any_llm/constants.py
@@ -54,6 +54,7 @@ class LLMProvider(StrEnum):
PERPLEXITY = "perplexity"
MINIMAX = "minimax"
ZAI = "zai"
COPILOT_SDK = "copilot_sdk"
GATEWAY = "gateway"

@classmethod
7 changes: 7 additions & 0 deletions src/any_llm/providers/copilot_sdk/__init__.py
@@ -0,0 +1,7 @@
from .copilot_sdk import CopilotSdkProvider

# Factory alias: AnyLLM._create_provider() derives class names via
# provider_key.capitalize() + "Provider", which yields "Copilot_sdkProvider".
Copilot_sdkProvider = CopilotSdkProvider

__all__ = ["CopilotSdkProvider", "Copilot_sdkProvider"]
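
The name derivation the comment describes can be checked directly: `str.capitalize()` lowercases everything after the first character, so the underscore survives and the derived class name keeps the `_sdk` suffix.

```python
# str.capitalize() only uppercases the first character and lowercases the
# rest, so "copilot_sdk" maps to "Copilot_sdkProvider", not "CopilotSdkProvider".
provider_key = "copilot_sdk"
derived = provider_key.capitalize() + "Provider"
print(derived)  # → Copilot_sdkProvider
```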