Summary
Implement a DeepSeek provider (`pkg/provider/deepseek`). DeepSeek's API is OpenAI-compatible, making this a straightforward implementation.
Requirements
- Implement the `llm.Client` interface for the DeepSeek API (https://api.deepseek.com)
- API key via the `DEEPSEEK_API_KEY` environment variable
- Support streaming and non-streaming chat completions
- Support tool/function calling
- Support thinking/reasoning (DeepSeek-R1 uses `<think>` blocks)
- Register the provider in the server command alongside existing providers
Models
- DeepSeek-V3 (chat)
- DeepSeek-R1 (reasoning)
- DeepSeek-Coder (code-focused)
Notes
- The API follows OpenAI's chat completions format, so much of the implementation can mirror or share code with a future OpenAI provider
- There is existing code in `etc/_old/deepseek/` that may be useful as reference
- DeepSeek-R1 reasoning output appears in the `reasoning_content` field of the response, which needs to be mapped to the thinking content block model
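The `reasoning_content` mapping can be sketched as follows. The `contentBlock` type is a hypothetical stand-in for the project's thinking/text content block model; only the `reasoning_content` and `content` field names come from DeepSeek's response format:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// choiceMessage holds the relevant fields of a DeepSeek-R1 response
// message: reasoning_content is DeepSeek's extension to the OpenAI
// format for the model's reasoning; content holds the final answer.
type choiceMessage struct {
	Content          string `json:"content"`
	ReasoningContent string `json:"reasoning_content"`
}

// contentBlock stands in for the project's content block model
// (names here are hypothetical).
type contentBlock struct {
	Type string // "thinking" or "text"
	Text string
}

// toBlocks maps an R1 message to content blocks, emitting the
// reasoning as a thinking block ahead of the answer text.
func toBlocks(m choiceMessage) []contentBlock {
	var blocks []contentBlock
	if m.ReasoningContent != "" {
		blocks = append(blocks, contentBlock{Type: "thinking", Text: m.ReasoningContent})
	}
	if m.Content != "" {
		blocks = append(blocks, contentBlock{Type: "text", Text: m.Content})
	}
	return blocks
}

func main() {
	raw := `{"reasoning_content":"Let me check...","content":"The answer is 4."}`
	var msg choiceMessage
	json.Unmarshal([]byte(raw), &msg)
	for _, b := range toBlocks(msg) {
		fmt.Printf("%s: %s\n", b.Type, b.Text)
	}
}
```

Emitting the thinking block before the text block matches how the reasoning precedes the answer in the model's output.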
Motivation
Quick win: the OpenAI-compatible API means minimal new code. DeepSeek-R1 is a popular reasoning model and adds another thinking-capable provider alongside Gemini and Anthropic.