
Add DeepSeek provider (OpenAI-compatible API) #23

@djthorpe

Description

Summary

Implement a DeepSeek provider (pkg/provider/deepseek). DeepSeek's API is OpenAI-compatible, making this a straightforward implementation.

Requirements

  • Implement llm.Client interface for the DeepSeek API (https://api.deepseek.com)
  • API key via DEEPSEEK_API_KEY environment variable
  • Support streaming and non-streaming chat completions
  • Support tool/function calling
  • Support thinking/reasoning (DeepSeek-R1 uses <think> blocks)
  • Register provider in the server command alongside existing providers

Models

  • DeepSeek-V3 (chat)
  • DeepSeek-R1 (reasoning)
  • DeepSeek-Coder (code-focused)

Notes

  • The API follows OpenAI's chat completions format, so much of the implementation can mirror or share code with a future OpenAI provider
  • There is existing code in etc/_old/deepseek/ that may be useful as reference
  • DeepSeek-R1 reasoning output appears in the reasoning_content field of the response, which needs to be mapped to the thinking content block model

Motivation

Quick win: the OpenAI-compatible API means minimal new code. DeepSeek-R1 is a popular reasoning model and adds another thinking-capable provider alongside Gemini and Anthropic.

