Changes from 2 commits
5 changes: 5 additions & 0 deletions .changeset/gold-vans-complain.md
@@ -0,0 +1,5 @@
---
"@openai/agents-openai": patch
---

fix: resolve #425 duplicate item error when using conversationId with tools
31 changes: 29 additions & 2 deletions packages/agents-openai/src/openaiResponsesModel.ts
@@ -426,6 +426,7 @@ function getPrompt(prompt: ModelRequest['prompt']):

function getInputItems(
input: ModelRequest['input'],
conversationId?: string,
): OpenAI.Responses.ResponseInputItem[] {
if (typeof input === 'string') {
return [
@@ -436,7 +437,33 @@
];
}

return input.map((item) => {
// When using conversationId, the OpenAI Responses API automatically retrieves
// the conversation history. To avoid duplicate items with the same IDs,
// we need to filter out items that would already be present in the conversation.
// We keep only the items from the current turn (typically the last few items).
let filteredInput = input;
if (conversationId) {
// Find the last user message to identify the start of the current turn
let lastUserMessageIndex = -1;
for (let i = input.length - 1; i >= 0; i--) {
const item = input[i];
if (isMessageItem(item) && item.role === 'user') {
lastUserMessageIndex = i;
break;
}
}

// If we found a user message, only include items from that point onwards
// This represents the current turn's conversation
if (lastUserMessageIndex >= 0) {
filteredInput = input.slice(lastUserMessageIndex);

[P1] Filtering from last user message still resends duplicated tool items

The new filtering logic slices the input array starting at the last user message, but in a tool-driven turn the items that cause the duplicate-ID error are typically the assistant’s tool call and other model-generated items that follow that user message. During the follow‑up request that posts tool results, getTurnInput still includes the original user message and the model’s tool call (with the same IDs as already stored in the conversation). Because those items fall after the last user message, this function keeps them and they are resent alongside the tool output, so the API still receives duplicates and fails. To avoid the error you need to send only the items generated since the last request (e.g., by tracking the last response ID or filtering by new item IDs), not everything after the last user message.
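
The alternative the comment points at — resending only items generated since the last request — could be sketched roughly as below. This is a hedged illustration, not SDK code: `filterUnsentItems`, `recordSentItems`, and the simplified item shape are hypothetical stand-ins for the real `ResponseInputItem` types.

```typescript
// Simplified stand-in for the SDK's input item type.
type InputItem = { id?: string; role?: string; content?: string };

// Keep only items the server does not already store for this conversation.
// Items without an ID were never persisted, so they are always sent.
function filterUnsentItems(
  input: InputItem[],
  sentItemIds: Set<string>,
): InputItem[] {
  return input.filter((item) => !item.id || !sentItemIds.has(item.id));
}

// After each request, record the IDs that the conversation now contains.
function recordSentItems(items: InputItem[], sentItemIds: Set<string>): void {
  for (const item of items) {
    if (item.id) sentItemIds.add(item.id);
  }
}

// Example: on the follow-up request that posts a tool result, the original
// user message and the model's tool call (already stored, same IDs) are
// filtered out; only the fresh tool output (no ID yet) is sent.
const sent = new Set<string>(['msg_1', 'call_1']);
const turnInput: InputItem[] = [
  { id: 'msg_1', role: 'user', content: 'What is the weather?' },
  { id: 'call_1', role: 'assistant', content: '(tool call)' },
  { role: 'tool', content: 'Sunny, 22°C' },
];
const toSend = filterUnsentItems(turnInput, sent);
console.log(toSend.length); // 1 — only the tool output survives
recordSentItems(turnInput, sent);
console.log(sent.size); // 2 — the tool output has no ID yet, so nothing new
```

Unlike slicing from the last user message, this keeps nothing that carries an ID the conversation already holds, which is what the duplicate-ID error is about.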


Member: I think this is a valid point.

Member: @codex can you suggest alternative approaches here?

codex: For now, I can only help with PRs you've created.

Author: Hi @seratch, would you review the latest commit?
• Added heuristics in getInputItems to detect items already persisted in the conversation when conversationId is set, slicing the input to send only newly generated turn data and avoid duplicate IDs.
• Extended unit coverage to verify that stored items are skipped while fresh inputs continue to flow when no persisted metadata exists.

} else {
// If no user message found, include all items (fallback)
filteredInput = input;
}
}

return filteredInput.map((item) => {
if (isMessageItem(item)) {
return getMessageItem(item);
}
@@ -847,7 +874,7 @@ export class OpenAIResponsesModel implements Model {
): Promise<
Stream<OpenAI.Responses.ResponseStreamEvent> | OpenAI.Responses.Response
> {
const input = getInputItems(request.input);
const input = getInputItems(request.input, request.conversationId);
const { tools, include } = getTools(request.tools, request.handoffs);
const toolChoice = getToolChoice(request.modelSettings.toolChoice);
const { text, ...restOfProviderData } =
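
The slicing heuristic added in this diff can be exercised in isolation. Below is a simplified, self-contained sketch — the item type is a stand-in, not the SDK's real `ResponseInputItem` — showing what the `conversationId` branch keeps:

```typescript
type Item = { role: 'user' | 'assistant' | 'tool'; content: string };

// Simplified version of the diff's heuristic: when a conversationId is
// present, keep only the items from the last user message onward; otherwise
// send the input unchanged.
function sliceCurrentTurn(input: Item[], conversationId?: string): Item[] {
  if (!conversationId) return input;
  let lastUserMessageIndex = -1;
  for (let i = input.length - 1; i >= 0; i--) {
    if (input[i].role === 'user') {
      lastUserMessageIndex = i;
      break;
    }
  }
  // Fall back to the full input when no user message is found.
  return lastUserMessageIndex >= 0 ? input.slice(lastUserMessageIndex) : input;
}

const history: Item[] = [
  { role: 'user', content: 'first question' },
  { role: 'assistant', content: 'first answer' },
  { role: 'user', content: 'second question' },
  { role: 'assistant', content: '(tool call)' },
  { role: 'tool', content: '(tool output)' },
];

// Without a conversationId everything is sent; with one, only the last turn.
console.log(sliceCurrentTurn(history).length); // 5
console.log(sliceCurrentTurn(history, 'conv_123').length); // 3
```

Note that the sliced turn still contains the assistant's tool call alongside the tool output, which is the behavior the P1 review comment flags as a remaining source of duplicate IDs.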