
feat: [#916] Implement AI module with provider resolver and OpenAI driver #1421

Open
hwbrzzl wants to merge 2 commits into master from bowen/#916

Conversation

hwbrzzl (Contributor) commented Mar 25, 2026

Summary

  • Conversation.Prompt no longer accepts context.Context; context is bound once via AI.WithContext(ctx) and reused across all operations
  • A ProviderResolver lazily initialises and caches AI providers from config, with Via accepting a live Provider instance or a func() (Provider, error) factory
  • Built-in OpenAI provider translates AgentPrompt into chat completion requests with a configurable API key, base URL, and default text model (gpt-5.4)

Closes #916

Why

The AI module now has a concrete implementation behind its interfaces. Moving context out of Conversation.Prompt and into AI.WithContext(ctx) keeps the per-prompt call clean: callers bind a context once and every subsequent Prompt and Agent call carries it automatically without repetition.

The new ProviderResolver reads from the ai config key, initialises each driver on first use, and caches it, so the construction cost is paid only once. The Via field accepts either a live Provider for direct injection (e.g. in tests) or a func() (Provider, error) for lazy construction, keeping the bootstrap path flexible.

app := NewApplication(ctx, contractsai.Config{
    Default: "openai",
    Providers: map[string]contractsai.ProviderConfig{
        "openai": {Via: provider},
    },
})

conv, _ := app.Agent(agent, WithModel("gpt-5.4"))
resp, _ := conv.Prompt("What is 2 + 2?")
fmt.Println(resp.Text()) // "Four"

// History is preserved across turns automatically
resp, _ = conv.Prompt("And multiply that by 3?")
fmt.Println(conv.Messages())
// [{user "What is 2 + 2?"} {assistant "Four"} {user "And multiply that by 3?"} ...]
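The resolver behaviour described above (lazy initialisation, one-time construction cost, and the two accepted `Via` forms) can be sketched like this. The names `ProviderConfig`, `Resolve`, and the stand-in `Provider` interface are illustrative, not the framework's actual API.

```go
package main

import (
	"fmt"
	"sync"
)

// Provider is a stand-in for the module's provider contract.
type Provider interface {
	Name() string
}

type staticProvider struct{ name string }

func (p staticProvider) Name() string { return p.name }

// ProviderConfig mirrors the described config shape: Via holds either a
// live Provider or a func() (Provider, error) factory.
type ProviderConfig struct {
	Via any
}

type resolver struct {
	mu       sync.Mutex
	configs  map[string]ProviderConfig
	resolved map[string]Provider
}

func newResolver(configs map[string]ProviderConfig) *resolver {
	return &resolver{configs: configs, resolved: map[string]Provider{}}
}

// Resolve initialises the named provider on first use and caches it, so
// the construction cost is paid only once.
func (r *resolver) Resolve(name string) (Provider, error) {
	r.mu.Lock()
	defer r.mu.Unlock()
	if p, ok := r.resolved[name]; ok {
		return p, nil
	}
	cfg, ok := r.configs[name]
	if !ok {
		return nil, fmt.Errorf("ai provider %q is not configured", name)
	}
	switch via := cfg.Via.(type) {
	case Provider: // direct injection, e.g. in tests
		r.resolved[name] = via
		return via, nil
	case func() (Provider, error): // lazy construction
		p, err := via()
		if err != nil {
			return nil, err
		}
		r.resolved[name] = p
		return p, nil
	default:
		return nil, fmt.Errorf("ai provider %q has an unsupported Via type", name)
	}
}

func main() {
	calls := 0
	r := newResolver(map[string]ProviderConfig{
		"openai": {Via: func() (Provider, error) {
			calls++
			return staticProvider{name: "openai"}, nil
		}},
	})
	p, _ := r.Resolve("openai")
	r.Resolve("openai") // served from cache; the factory does not run again
	fmt.Println(p.Name(), calls)
}
```

Accepting both forms keeps tests trivial (inject a fake directly) while letting production config defer expensive client construction until first use.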

The OpenAI provider maps Agent.Instructions() to the OpenAI system message, Agent.Messages() to the preceding conversation history, and the current Input to the final user message, then calls Chat.Completions.New via the openai-go client. When the provider config omits a default text model, it falls back to gpt-5.4.
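The assembly order described above can be sketched as plain message construction. The real provider builds openai-go chat completion params; the `message` type, `buildMessages`, and `resolveModel` here are hypothetical helpers that show only the ordering (system instructions, then history, then the current input) and the model fallback.

```go
package main

import "fmt"

// message is an illustrative role/content pair, not the openai-go type.
type message struct {
	Role    string
	Content string
}

// Fallback used when the provider config omits a default text model,
// per the PR description.
const defaultTextModel = "gpt-5.4"

// buildMessages assembles the request in the described order:
// system instructions first, then prior conversation turns, then the
// current input as the final user message.
func buildMessages(instructions string, history []message, input string) []message {
	msgs := make([]message, 0, len(history)+2)
	if instructions != "" {
		msgs = append(msgs, message{Role: "system", Content: instructions})
	}
	msgs = append(msgs, history...) // preceding conversation history
	msgs = append(msgs, message{Role: "user", Content: input})
	return msgs
}

// resolveModel applies the configured model, falling back to the default.
func resolveModel(configured string) string {
	if configured == "" {
		return defaultTextModel
	}
	return configured
}

func main() {
	msgs := buildMessages(
		"You are a terse assistant.",
		[]message{{"user", "What is 2 + 2?"}, {"assistant", "Four"}},
		"And multiply that by 3?",
	)
	for _, m := range msgs {
		fmt.Printf("%s: %s\n", m.Role, m.Content)
	}
	fmt.Println(resolveModel(""))
}
```

In the real driver, each assembled entry would become a system/user/assistant message param passed to `Chat.Completions.New`.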


@hwbrzzl hwbrzzl requested a review from a team as a code owner March 25, 2026 09:30

codecov bot commented Mar 25, 2026

Codecov Report

❌ Patch coverage is 99.18033% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 68.20%. Comparing base (c0f099a) to head (fafec5a).
⚠️ Report is 6 commits behind head on master.

Files with missing lines Patch % Lines
ai/conversation.go 95.65% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1421      +/-   ##
==========================================
+ Coverage   65.94%   68.20%   +2.25%     
==========================================
  Files         353      357       +4     
  Lines       26083    27630    +1547     
==========================================
+ Hits        17201    18844    +1643     
+ Misses       8141     7937     -204     
- Partials      741      849     +108     

