[Bug]: GPT-5 doesn't support max_tokens; use max_completion_tokens instead #510

@zakriyaalisabir

Description

Opencommit Version

3.2.10

Node Version

v22.14.0

NPM Version

9.8.1

What OS are you seeing the problem on?

Mac

What happened?

Got this error when trying to use model = gpt-5-nano with provider = openai:

400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

Expected Behavior

The commit message should be generated successfully, as with other OpenAI models.

Current Behavior

open-commit
│
◇  1 staged files:
   openapi_config_dev.tf
│
◇  ✖ Failed to generate the commit message
BadRequestError3: 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
    at _APIError.generate (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:63105:14)
    at OpenAI.makeStatusError (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:62641:22)
    at OpenAI.makeRequest (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:62684:24)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAiEngine.generateCommitMessage (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:66539:28)
    at async generateCommitMessageByDiff (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:67373:27)
    at async generateCommitMessageFromGitDiff (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:67595:25)
    at async trytm (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:67563:18)
    at async commit (/Users/zakriyaalisabir/.nvm/versions/node/v22.14.0/lib/node_modules/opencommit/out/cli.cjs:67768:35) {
  status: 400,
  headers: {
    'access-control-expose-headers': 'X-Request-ID',
    'alt-svc': 'h3=":443"; ma=86400',
    'cf-cache-status': 'DYNAMIC',
    'cf-ray': '96bd9448efb899a6-BKK',
    connection: 'keep-alive',
    'content-length': '245',
    'content-type': 'application/json',
    date: 'Fri, 08 Aug 2025 08:17:46 GMT',
    'openai-organization': 'personal-jtddxj',
    'openai-processing-ms': '30',
    'openai-project': 'proj_eCY4aMVNwQMbxmgQAIUjP8px',
    'openai-version': '2020-10-01',
    server: 'cloudflare',
    'set-cookie': '__cf_bm=RRjE3GkDUk5Jies9AZ3oJJCJ5SGHLA1WrIEG19ntHiE-1754641066-1.0.1.1-wJXjZMtnCAbobQxcjTahng_mKK6hiZ1NMxmAJFLM3q7F9WYCUtZPMclstOHoVFKsf1zNTfcXpZ5Rd3dZiKULdrZ.9UP9AbZ16DpRFJPgGjA; path=/; expires=Fri, 08-Aug-25 08:47:46 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None, _cfuvid=dFVNKvQUJJljvr2GLkrs.OQm5lNDFjvXUoDyvP5Tpjg-1754641066665-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None',
    'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
    'x-content-type-options': 'nosniff',
    'x-envoy-upstream-service-time': '54',
    'x-ratelimit-limit-requests': '500',
    'x-ratelimit-limit-tokens': '200000',
    'x-ratelimit-remaining-requests': '499',
    'x-ratelimit-remaining-tokens': '198928',
    'x-ratelimit-reset-requests': '120ms',
    'x-ratelimit-reset-tokens': '321ms',
    'x-request-id': 'req_8e2c19296b32418382b4154d90fdc647'
  },
  request_id: 'req_8e2c19296b32418382b4154d90fdc647',
  error: {
    message: "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.",
    type: 'invalid_request_error',
    param: 'max_tokens',
    code: 'unsupported_parameter'
  },
  code: 'unsupported_parameter',
  param: 'max_tokens',
  type: 'invalid_request_error'
}
│
└  ✖ 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

Possible Solution

No response
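One possible fix (a minimal sketch, not opencommit's actual code): have the OpenAI engine choose which token-limit field to send based on the model family, since the API error above shows GPT-5 models reject max_tokens and expect max_completion_tokens. The helper name tokenLimitParam and the model-prefix list below are assumptions for illustration.

```typescript
// Sketch only: pick the token-limit field per model family.
// The prefix list is an assumption; GPT-5 and o-series models reject
// `max_tokens` and require `max_completion_tokens` (per the 400 error above).

type TokenLimitParam =
  | { max_tokens: number }
  | { max_completion_tokens: number };

// Hypothetical helper, not part of opencommit today.
function tokenLimitParam(model: string, limit: number): TokenLimitParam {
  const needsCompletionTokens = /^(gpt-5|o1|o3|o4)/.test(model);
  return needsCompletionTokens
    ? { max_completion_tokens: limit }
    : { max_tokens: limit };
}

// Usage: spread the result into the chat-completions payload instead of
// hardcoding max_tokens:
//   openai.chat.completions.create({
//     model,
//     messages,
//     ...tokenLimitParam(model, maxTokens),
//   })
```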

Steps to Reproduce

No response

Relevant log output

Metadata


Labels

bug (Something isn't working)
