
Conversation

@CsBigDataHub
Contributor

  • Add version parameter to openai-chat provider config to support different API versions
  • Version 1: Uses max_tokens parameter (for Mistral AI and older OpenAI-compatible endpoints)
  • Version 2 (default): Uses max_completion_tokens parameter (for OpenAI and modern endpoints)
  • Fixes Mistral AI 422 errors by allowing configuration of appropriate token parameter
  • Supports both provider-level and model-level version configuration with provider taking precedence
  • Maintains full backward compatibility by defaulting to version 2
  • Automatically removes conflicting token parameters based on version
  • Prevents sending version parameter to external APIs
  • Includes comprehensive integration tests
  • Updates documentation with examples and schema
  • Adds CHANGELOG entry under Unreleased section
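The provider-level and model-level configuration described above might look roughly like the following. This is a hedged sketch only: the key names (`providers`, `api`, `version`, `models`) are assumptions drawn from this PR description and have not been verified against `docs/configuration.md`.

```json
{
  "providers": {
    "mistral": {
      "api": "openai-chat",
      "version": 1,
      "models": {
        "mistral-small-latest": {
          "version": 1
        }
      }
    }
  }
}
```

Per the description, when both levels are set the provider-level `version` takes precedence, and omitting it entirely falls back to version 2 (`max_completion_tokens`) for backward compatibility.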

…llm_api.clj

* src/eca/llm_api.clj (api-defn): Add :provider-config key to handle dynamic token limits and version-specific behavior.
* docs/configuration.md (config): Add version parameter with API version explanation for openai-chat providers.
* test/eca/llm_providers/openai_chat_test.clj (test-helper, tests): Add version parameter handling for max_tokens vs max_completion_tokens API differences.
…lized-input

* integration-test/llm_mock/openai_chat.clj: add token params to normalized-body
@CsBigDataHub
Contributor Author

@ericdallo please do NOT merge, I am having tool call issues

LLM response status: 400 body: {"object":"error","message":"Tool call id was 7878-452755744 but must be a-z, A-Z, 0-9, with a length of 9.","type":"invalid_function_call","param":null,"code":"3280"}
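The 400 response above indicates Mistral only accepts tool-call ids matching `[a-zA-Z0-9]{9}`. A minimal sketch of one possible workaround, a hypothetical helper that is not part of this PR, which coerces an arbitrary id into that shape:

```clojure
(require '[clojure.string :as string])

;; Hypothetical helper, not part of this PR: per the error above, Mistral
;; rejects tool-call ids unless they are exactly 9 alphanumeric characters.
(defn sanitize-tool-call-id
  "Strip non-alphanumeric characters, then pad/truncate to exactly 9 chars."
  [id]
  (let [alnum  (string/replace (str id) #"[^a-zA-Z0-9]" "")
        padded (apply str alnum (repeat 9 "0"))]
    (subs padded 0 9)))

;; (sanitize-tool-call-id "7878-452755744") ;; => "787845275"
```

Note that any such mapping would have to be applied consistently to both the assistant message's tool-call id and the matching tool-result's `tool_call_id`, and truncation can in principle make two distinct ids collide, so a real fix would likely keep a request-scoped original-id → sanitized-id table instead.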

@ericdallo
Member

@CsBigDataHub let me think a little bit. I don't want to have different functions, APIs, or complex things just to add or remove one param; I want the user to be able to customize that like we did for responses and anthropic. We may manage to do the same, let me take a look.

@ericdallo ericdallo closed this in afe9899 Dec 13, 2025