Commit a90ce80

Switch to OpenAI Responses API (#1981)
## Summary

#1960 added support for OpenAI's Chat Completions API. This change switches to [OpenAI's new Responses API](https://developers.openai.com/api/docs/guides/migrate-to-responses) instead.

### How to test locally or on Vercel

1. Set env vars: `AI_PROVIDER=openai AI_API_KEY= AI_BASE_URL=<> AI_MODEL_NAME=<> AI_REQUEST_HEADERS={"X-Client-Id":"","X-Username":""} AI_ADDITIONAL_OPTIONS={"API_TYPE":"responses"}`
2. Open HyperDX's chart explorer and use the AI assistant chart builder, e.g. "show me error count by service in the last hour".
3. Confirm the assistant returns a valid chart config.

### References

- Linear Issue:
- Related PRs:

Co-authored-by: peter-leonov-ch <209667683+peter-leonov-ch@users.noreply.github.com>
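The env vars from step 1 can be exported in a shell before starting the API. A minimal sketch; the key, base URL, model name, and header values below are placeholder assumptions for illustration, not values from this commit:

```shell
# Placeholder values: AI_API_KEY, AI_BASE_URL, AI_MODEL_NAME, and the
# request-header values are assumptions; substitute your own.
export AI_PROVIDER=openai
export AI_API_KEY="sk-..."
export AI_BASE_URL="https://api.openai.com/v1"
export AI_MODEL_NAME="gpt-4o"
export AI_REQUEST_HEADERS='{"X-Client-Id":"my-client","X-Username":"me"}'
# Opt in to the Responses API, per the commit message above
export AI_ADDITIONAL_OPTIONS='{"API_TYPE":"responses"}'
```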
1 parent 7828a74 commit a90ce80

File tree

3 files changed: +12 −7 lines changed

.changeset/sharp-eggs-appear.md

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+---
+'@hyperdx/api': patch
+---
+
+Update OpenAI model configuration to use the new Responses API
```

packages/api/src/controllers/__tests__/ai.test.ts

Lines changed: 5 additions & 3 deletions

```diff
@@ -13,9 +13,11 @@ const mockCreateAnthropic = jest.fn(
   (_opts?: Record<string, unknown>) => mockAnthropicFactory,
 );
 
-const mockOpenAIChatFactory = jest.fn((_model?: string) => mockOpenAIModel);
+const mockOpenAIResponsesFactory = jest.fn(
+  (_model?: string) => mockOpenAIModel,
+);
 const mockCreateOpenAI = jest.fn((_opts?: Record<string, unknown>) => ({
-  chat: mockOpenAIChatFactory,
+  responses: mockOpenAIResponsesFactory,
 }));
 
 jest.mock('@ai-sdk/anthropic', () => ({
@@ -191,7 +193,7 @@ describe('openai provider', () => {
     expect(mockCreateOpenAI).toHaveBeenCalledWith(
       expect.objectContaining({ apiKey: 'sk-test' }),
     );
-    expect(mockOpenAIChatFactory).toHaveBeenCalledWith('gpt-4o');
+    expect(mockOpenAIResponsesFactory).toHaveBeenCalledWith('gpt-4o');
   });
 
   it('passes baseURL when AI_BASE_URL is set', () => {
```

packages/api/src/controllers/ai.ts

Lines changed: 2 additions & 4 deletions

```diff
@@ -368,9 +368,7 @@ function getAnthropicModel(): LanguageModel {
 }
 
 /**
- * Configure OpenAI-compatible model.
- * Works with any OpenAI Chat Completions-compatible endpoint
- * (e.g. Azure OpenAI, OpenRouter, LiteLLM proxies).
+ * Configure OpenAI-compatible model using the Responses API (/v1/responses).
 */
 function getOpenAIModel(): LanguageModel {
   const apiKey = config.AI_API_KEY;
@@ -399,5 +397,5 @@ function getOpenAIModel(): LanguageModel {
     ...(Object.keys(headers).length > 0 && { headers }),
   });
 
-  return openai.chat(config.AI_MODEL_NAME);
+  return openai.responses(config.AI_MODEL_NAME);
 }
```
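The diff above swaps the AI SDK provider's `chat` factory for its `responses` factory. A self-contained sketch of the pattern, using stand-in types rather than the real `@ai-sdk/openai` package (whose `createOpenAI` likewise returns an object exposing `chat` and `responses` model factories):

```typescript
// Illustrative stand-in for @ai-sdk/openai's provider shape; names mirror
// the diff above but this is NOT the real package.
type LanguageModel = { modelId: string; api: 'chat' | 'responses' };

function createOpenAI(_opts: { apiKey: string; baseURL?: string }) {
  return {
    // Legacy Chat Completions endpoint (/v1/chat/completions)
    chat: (modelId: string): LanguageModel => ({ modelId, api: 'chat' }),
    // New Responses endpoint (/v1/responses), adopted by this commit
    responses: (modelId: string): LanguageModel => ({
      modelId,
      api: 'responses',
    }),
  };
}

function getOpenAIModel(modelName: string): LanguageModel {
  const openai = createOpenAI({ apiKey: 'sk-test' });
  // Before this commit: return openai.chat(modelName);
  return openai.responses(modelName);
}

console.log(getOpenAIModel('gpt-4o').api); // → "responses"
```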
