
Commit 6244a29

feat(provider): add Groq provider (#33)
* add groq provider
1 parent 3f013e0 commit 6244a29

6 files changed (+509 −3 lines changed)

CLAUDE.md

Lines changed: 10 additions & 1 deletion
@@ -83,21 +83,28 @@ Providers implement `ErrorConverter` using `errors.As` with SDK typed errors (no
 ### Key Patterns
 
 - **Configuration**: Functional options with validation
-- **Constants**: Extract ALL magic strings to named constants (including response format types like `json_object`)
+- **Constants**: Extract ALL magic strings to named constants (including response format types like `json_object`). Constants belong in production code files, not test files
 - **Streaming**: Break monolithic handlers into focused methods (see `anthropic/anthropic.go`)
 - **Streaming Safety**: Always use `select` with `ctx.Done()` when sending to channels in goroutines to prevent blocking forever if consumer abandons
 - **ID Generation**: Use `crypto/rand`, not package-level mutable state
 - **Error Conversion**: Use `errors.As` with SDK typed errors; avoid string matching when possible
 - **Input Validation**: Validate required fields (Model non-empty, Messages has entries) before API calls
 - **Unknown Values**: Never silently convert unknown enum values (e.g., unknown role → user); error or log warning instead
 - **Struct Field Order**: Order struct fields A-Z (don't optimize for padding)
+- **Slice Cloning**: Prefer `slices.Clone` over manual `make` + `copy`
+- **Slice Capacity**: Use `make([]T, 0, len(source))` when building slices from known-size sources
+- **ContentString()**: Use `msg.ContentString()` instead of `msg.Content.(string)` type assertions
+- **Switch Completeness**: Switch statements should have a `default` case (error or explicit fallback)
+- **Variable Naming**: Don't shadow imported package names (e.g., use `imgURL` not `url` when `net/url` is imported)
+- **Error Messages**: Don't double-print provider name when the base error already includes it
 
 ### OpenAI-Compatible Providers
 
 For providers that expose OpenAI-compatible APIs but don't have their own Go SDK (Llamafile, vLLM, LM Studio, etc.):
 - Use the compatible provider in `providers/openai/compatible.go`
 - Import: `"github.com/mozilla-ai/any-llm-go/providers/openai"`
 - Create thin wrapper that calls `openai.NewCompatible()` with provider-specific `CompatibleConfig`
+- Set ALL `CompatibleConfig` fields explicitly, including empty values (e.g., `BaseURLEnvVar: ""`, `DefaultAPIKey: ""`)
 - Add interface assertions in the wrapper package
 
 ### Testing
@@ -110,6 +117,8 @@ For providers that expose OpenAI-compatible APIs but don't have their own Go SDK
 - Skip integration tests gracefully when provider unavailable
 - Use constants (e.g., `objectChatCompletion`) instead of string literals in test assertions
 - Base packages need their own test suites, not just wrapper tests
+- No redundant assertions (e.g., `require.NotEmpty` already checks len > 0, don't follow with `require.Greater`)
+- Add a comment when intentionally discarding return values (e.g., `_, _ = fn()`)
 
 ## Adding a New Provider

README.md

Lines changed: 1 addition & 0 deletions
@@ -281,6 +281,7 @@ for _, model := range models.Data {
 | Anthropic ||||||
 | DeepSeek ||||||
 | Gemini ||||||
+| Groq ||||||
 | Llamafile ||||||
 | Mistral ||||||
 | Ollama ||||||

docs/providers.md

Lines changed: 37 additions & 1 deletion
@@ -9,6 +9,7 @@ any-llm-go supports multiple LLM providers through a unified interface. Each pro
 | [Anthropic](#anthropic) | `anthropic` |||||||
 | [DeepSeek](#deepseek) | `deepseek` |||||||
 | [Gemini](#gemini) | `gemini` |||||||
+| [Groq](#groq) | `groq` |||||||
 | [Llamafile](#llamafile) | `llamafile` |||||||
 | [Mistral](#mistral) | `mistral` |||||||
 | [Ollama](#ollama) | `ollama` |||||||
@@ -148,6 +149,42 @@ if response.Choices[0].Message.Reasoning != nil {
 }
 ```
 
+### Groq
+
+Groq provides fast inference through their cloud API. It exposes an OpenAI-compatible API.
+
+```go
+import (
+	anyllm "github.com/mozilla-ai/any-llm-go"
+	"github.com/mozilla-ai/any-llm-go/providers/groq"
+)
+
+// Using environment variable (GROQ_API_KEY).
+provider, err := groq.New()
+
+// Or with explicit API key.
+provider, err := groq.New(anyllm.WithAPIKey("gsk_..."))
+```
+
+**Environment Variable:** `GROQ_API_KEY`
+
+**Popular Models:**
+- `llama-3.1-8b-instant` - Fast and cost-effective
+- `llama-3.3-70b-versatile` - More capable model
+- `mixtral-8x7b-32768` - Mixtral with 32k context
+
+**Completion:**
+
+```go
+provider, _ := groq.New()
+resp, err := provider.Completion(ctx, anyllm.CompletionParams{
+	Model: "llama-3.1-8b-instant",
+	Messages: []anyllm.Message{
+		{Role: anyllm.RoleUser, Content: "Hello!"},
+	},
+})
+```
+
 ### Mistral
 
 ```go
@@ -369,7 +406,6 @@ The following providers are planned for future releases:
 
 | Provider | Status |
 |--------------|---------------------------------------------------|
-| Groq | Planned |
 | Cohere | Planned |
 | Together AI | Planned |
 | AWS Bedrock | Planned |

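The `groq.New(anyllm.WithAPIKey(...))` calls in the docs above use Go's functional options pattern ("Configuration: Functional options with validation" in CLAUDE.md). A minimal standalone sketch with illustrative names, not the real `any-llm-go` types:

```go
package main

import (
	"errors"
	"fmt"
)

// clientConfig is an illustrative stand-in for a provider's configuration.
type clientConfig struct {
	apiKey  string
	baseURL string
}

// Option mutates the config; callers pass zero or more to the constructor.
type Option func(*clientConfig)

// WithAPIKey sets the API key explicitly, analogous to anyllm.WithAPIKey.
func WithAPIKey(key string) Option {
	return func(c *clientConfig) { c.apiKey = key }
}

// newConfig applies options over defaults and validates the result,
// mirroring how providers validate at construction time.
func newConfig(opts ...Option) (clientConfig, error) {
	c := clientConfig{baseURL: "https://api.groq.com/openai/v1"} // default
	for _, opt := range opts {
		opt(&c)
	}
	if c.apiKey == "" {
		return clientConfig{}, errors.New("api key is required")
	}
	return c, nil
}

func main() {
	if _, err := newConfig(); err != nil {
		fmt.Println("without options:", err)
	}
	c, err := newConfig(WithAPIKey("gsk_example"))
	fmt.Println("with option:", c.apiKey, err)
}
```

The pattern keeps the constructor signature stable as new settings are added, and validation runs once after all options have been applied.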
internal/testutil/fixtures.go

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ var ProviderModelMap = map[string]string{
 	"mistral":   "mistral-small-latest",
 	"gemini":    "gemini-2.5-flash",
 	"cohere":    "command-r",
-	"groq":      "llama-3.1-8b-instant",
+	"groq":      "llama-3.3-70b-versatile",
 	"ollama":    "llama3.2",
 	"llamafile": "LLaMA_CPP",
 	"together":  "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",

providers/groq/groq.go

Lines changed: 68 additions & 0 deletions
@@ -0,0 +1,68 @@
+// Package groq provides a Groq provider implementation for any-llm.
+// Groq exposes an OpenAI-compatible API optimized for fast inference.
+package groq
+
+import (
+	"github.com/mozilla-ai/any-llm-go/config"
+	"github.com/mozilla-ai/any-llm-go/providers"
+	"github.com/mozilla-ai/any-llm-go/providers/openai"
+)
+
+// Provider configuration constants.
+const (
+	defaultBaseURL = "https://api.groq.com/openai/v1"
+	envAPIKey      = "GROQ_API_KEY"
+	providerName   = "groq"
+)
+
+// Object type constants for API responses.
+const (
+	objectChatCompletion      = "chat.completion"
+	objectChatCompletionChunk = "chat.completion.chunk"
+	objectList                = "list"
+)
+
+// Ensure Provider implements the required interfaces.
+var (
+	_ providers.CapabilityProvider = (*Provider)(nil)
+	_ providers.ErrorConverter     = (*Provider)(nil)
+	_ providers.ModelLister        = (*Provider)(nil)
+	_ providers.Provider           = (*Provider)(nil)
+)
+
+// Provider implements the providers.Provider interface for Groq.
+// It embeds openai.CompatibleProvider since Groq exposes an OpenAI-compatible API.
+type Provider struct {
+	*openai.CompatibleProvider
+}
+
+// New creates a new Groq provider.
+func New(opts ...config.Option) (*Provider, error) {
+	base, err := openai.NewCompatible(openai.CompatibleConfig{
+		APIKeyEnvVar:   envAPIKey,
+		BaseURLEnvVar:  "",
+		Capabilities:   groqCapabilities(),
+		DefaultAPIKey:  "",
+		DefaultBaseURL: defaultBaseURL,
+		Name:           providerName,
+		RequireAPIKey:  true,
+	}, opts...)
+	if err != nil {
+		return nil, err
+	}
+
+	return &Provider{CompatibleProvider: base}, nil
+}
+
+// groqCapabilities returns the capabilities for the Groq provider.
+func groqCapabilities() providers.Capabilities {
+	return providers.Capabilities{
+		Completion:          true,
+		CompletionImage:     false, // Groq doesn't support image inputs.
+		CompletionPDF:       false,
+		CompletionReasoning: false, // Groq doesn't support reasoning parameters.
+		CompletionStreaming: true,
+		Embedding:           false, // Groq doesn't host embedding models.
+		ListModels:          true,
+	}
+}

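The `var _ providers.Provider = (*Provider)(nil)` block in `groq.go` is a compile-time interface assertion, the "Add interface assertions in the wrapper package" rule from CLAUDE.md. A self-contained sketch of the same pattern with illustrative types:

```go
package main

import "fmt"

// Provider is an illustrative stand-in for the library's interface.
type Provider interface {
	Name() string
}

// Groq is a toy implementation.
type Groq struct{}

func (Groq) Name() string { return "groq" }

// Compile-time assertion: if *Groq ever stops satisfying Provider,
// the build breaks right here rather than at a distant call site.
var _ Provider = (*Groq)(nil)

func main() {
	var p Provider = Groq{}
	fmt.Println(p.Name())
}
```

The `(*Groq)(nil)` conversion costs nothing at runtime; the blank identifier keeps the check from introducing an unused variable.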