
Model settings when providing a prompt ID #562

@aavetis


Describe the bug

I am providing a prompt object that has everything defined on the OpenAI side (prompt, tools, model settings, reasoning settings, etc.):

[Screenshot: prompt configuration in the OpenAI dashboard]

Shouldn't that mean that in my agent call I don't need to define anything about the model: the version, the reasoning settings, etc.?

However, if I just send this:

    const agent = new Agent({
      name: "TestAgent",
      // model: MODEL_DEF,
      prompt: {
        promptId: "pmpt_123123",
        variables: {
          userid: userId,
        },
      },
      tools: [
        someTool,
        webSearchTool(),
      ],
      outputType: DraftOutput,
      // modelSettings: {
      //   toolChoice: "auto",
      //   providerData: {
      //     reasoning: { effort: "low" },
      //     text: { verbosity: "low" },
      //   },
      // },
    });

Output:

Unsupported parameter: 'reasoning.effort' is not supported with this model.

I need to uncomment the commented lines for it to work. My understanding is that with the prompt object, I should be able to send it like this, but maybe I'm misunderstanding?

In fact, it completely ignores which model I provide; I just need to pass some gpt-5-class model so the SDK knows the reasoning-effort params are allowed.
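To make the workaround concrete, here is a minimal sketch of the configuration shape that reportedly works, with the previously commented lines restored. This is only an illustration of the duplicated settings described above, not a definitive SDK call: the `Agent` constructor itself is elided, and `"gpt-5-mini"` stands in for whatever gpt-5-class model id you use (the report says the exact id is ignored anyway).

```typescript
// Sketch of the settings the agent call apparently must restate, even though
// the dashboard-managed prompt already defines them (assumption based on the
// behavior described in this report).
const modelSettings = {
  toolChoice: "auto" as const,
  providerData: {
    reasoning: { effort: "low" },
    text: { verbosity: "low" },
  },
};

// Shape of the full Agent config with the workaround applied; in real code
// this object would be passed to `new Agent(...)` from @openai/agents.
const agentConfig = {
  name: "TestAgent",
  model: "gpt-5-mini", // hypothetical gpt-5-class id; reportedly only its class matters
  prompt: {
    promptId: "pmpt_123123",
    variables: { userid: "user_1" }, // placeholder user id
  },
  modelSettings,
};

console.log(agentConfig.modelSettings.providerData.reasoning.effort); // "low"
```

The point of the sketch is that both `model` and `modelSettings.providerData.reasoning` end up duplicated client-side, which is exactly what the prompt object was supposed to make unnecessary.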

Debug information

"node_modules/@openai/agents": {
      "version": "0.1.9",
...
        "@openai/agents-core": "0.1.8",
        "@openai/agents-openai": "0.1.9",
