This repository was archived by the owner on Jul 22, 2025. It is now read-only.

Conversation


@SamSaffron SamSaffron commented Nov 19, 2024

Adds support for Mistral models (Pixtral and Mistral Large now have presets).

Also corrects token accounting in AWS Bedrock models.

@SamSaffron SamSaffron changed the title FIX: allow the correct number of tokens for AWS models FEATURE: Add support for Mistral models Nov 19, 2024
module DiscourseAi
module Completions
module Endpoints
class Mistral < OpenAi
Contributor


Should provider_id be overridden? As written, this class still returns the one for OpenAi.

Member Author


oh sure ... good catch will fix
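For illustration, here is a minimal, self-contained sketch of why the override matters. The class names and the `provider_id` return values are hypothetical stand-ins, not the plugin's actual code: without redefining the method, the subclass keeps reporting the parent's identifier.

```ruby
# Hypothetical, simplified sketch: an endpoint subclass inherits its
# parent's provider_id unless it explicitly overrides the method.
class OpenAiEndpoint
  def provider_id
    :open_ai
  end
end

class MistralEndpoint < OpenAiEndpoint
  # Without this override, provider_id would still return :open_ai,
  # which is the issue raised in the review comment above.
  def provider_id
    :mistral
  end
end

MistralEndpoint.new.provider_id # => :mistral
```

This is just Ruby's normal method lookup: the subclass wins only for methods it actually redefines, so any identifier-style method on the parent must be shadowed deliberately.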

Contributor

@tyb-talks tyb-talks left a comment


Minor question on provider_id for the endpoint, but otherwise looks OK.

Contributor

@tyb-talks tyb-talks left a comment


I just spotted something I missed earlier in the OpenAi endpoint:

        def prepare_payload(prompt, model_params, dialect)
          payload = default_options.merge(model_params).merge(messages: prompt)

          if @streaming_mode
            payload[:stream] = true

            # Usage is not available in Azure yet.
            # We'll fallback to guess this using the tokenizer.
            payload[:stream_options] = { include_usage: true } if llm_model.provider == "open_ai"
          end

Contributor

@tyb-talks tyb-talks left a comment


😅 Actually it's fine; it just omits those params when the provider isn't OpenAi. 🤔 If the logic in the inherited endpoint later changes in an OpenAi-specific way, though, it would also affect this Mistral endpoint. Still fine for now.
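The fragility being discussed can be sketched with a small, self-contained example. The class names here are hypothetical stand-ins for the endpoint hierarchy, not the plugin's actual code: the parent gates `stream_options` on a provider check, so a subclass silently inherits whichever branch that check takes, and one way to decouple is to set the option unconditionally in the subclass.

```ruby
# Hypothetical, simplified sketch of the inheritance concern above.
class ParentEndpoint
  def initialize(provider)
    @provider = provider
  end

  def streaming_payload
    payload = { stream: true }
    # Parent-specific gate: only the "open_ai" provider requests usage
    # data in the stream, so other providers silently miss it.
    payload[:stream_options] = { include_usage: true } if @provider == "open_ai"
    payload
  end
end

class MistralLikeEndpoint < ParentEndpoint
  # One way to decouple from the parent's gate: always request usage,
  # since (per the discussion) the provider sends this info anyway.
  def streaming_payload
    super.merge(stream_options: { include_usage: true })
  end
end
```

With this sketch, `ParentEndpoint.new("mistral").streaming_payload` omits `stream_options`, while the subclass includes it regardless of provider, so later changes to the parent's gate cannot drop it by accident.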

@SamSaffron SamSaffron merged commit 755b63f into main Nov 19, 2024
6 checks passed
@SamSaffron SamSaffron deleted the mistral branch November 19, 2024 06:28
@SamSaffron
Member Author

Yeah, this is safe by "side effect" because Mistral just sends this info anyway.

@SamSaffron
Member Author

thanks for the review @tyb-talks 🤗

