
roocode with groq gives max_tokens error for kimi k2 #5739

@hamaadtahiir

Description


App Version

3.23.11

API Provider

Groq

Model Used

moonshotai/kimi-k2-instruct

Roo Code Task Links (Optional)


Roo Code assumes that the max_tokens limit for Kimi K2 through Groq is the same as its context window, which produces this error:

400 max_tokens must be less than or equal to 16384, the maximum value for max_tokens is less than the context_window for this model
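A minimal sketch of where a fix could go, assuming Roo Code keeps a per-provider model-info table. The field and function names below are illustrative, not taken from the Roo Code source; the 131072 context-window figure is an assumption, while the 16384 cap comes from the error above. The point is that the Groq entry for this model needs a separate output-token cap rather than reusing the context window:

```typescript
// Hypothetical shape of a provider model-info entry (illustrative names,
// not the actual Roo Code source).
interface ModelInfo {
  contextWindow: number; // total tokens the model can attend to
  maxTokens: number;     // hard cap Groq enforces on max_tokens for completions
}

const groqModels: Record<string, ModelInfo> = {
  "moonshotai/kimi-k2-instruct": {
    contextWindow: 131072, // assumed; check Groq's model listing
    maxTokens: 16384,      // Groq's completion-token limit per the 400 error
  },
};

// When building the request, clamp the requested output tokens to the
// provider's limit instead of defaulting to the context window.
function resolveMaxTokens(modelId: string, requested?: number): number {
  const info = groqModels[modelId];
  const limit = info?.maxTokens ?? 16384;
  return Math.min(requested ?? limit, limit);
}
```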

πŸ” Steps to Reproduce

Select the moonshotai/kimi-k2-instruct model with Groq as the API provider and run any task.

πŸ’₯ Outcome Summary

Expected the request to succeed, but it failed with the max_tokens error above.

πŸ“„ Relevant Logs or Errors (Optional)

Metadata

Labels: bug (Something isn't working)

Status: Done