This repository was archived by the owner on Jul 22, 2025. It is now read-only.

Conversation

@keegangeorge (Member) commented Jul 9, 2025

🔍 Overview

This update adds a setting to configure a default LLM model to be used for all features (unless overridden), and removes the custom: prefix when defining LLM models.

➕ More information

For all features, the feature's assigned AI persona's default_llm_id will be used; if it isn't set, SiteSetting.ai_default_llm_model will be used instead.
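
In other words, resolution falls back from the persona to the site-wide default. A minimal sketch of that fallback, using a hypothetical helper name (the actual lookup lives in the plugin's persona/feature code):

    # Hypothetical helper illustrating the fallback order described above:
    # the persona's default_llm_id wins; otherwise the site-wide default is used.
    def resolved_llm_id(persona)
      persona&.default_llm_id || SiteSetting.ai_default_llm_model.presence
    end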

This update also adds a few migrations which migrate the old model settings for Helper and Summarization into their respective personas (a sketch of one such migration follows the list). That is:

  • SiteSetting.ai_helper_model's existing value on sites will be copied into:

    AiPersona.where(id: [-18, -19, -20, -21, -22, -23, -24, -25, -26]).default_llm_id
  • SiteSetting.ai_summarization_model's existing value on sites will be copied into:

    AiPersona.where(id: [-11, -12]).default_llm_id
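
A rough sketch of the Helper half of that data migration, assuming raw SQL against the site_settings and ai_personas tables and the old "custom:<id>" value format mentioned in this PR (the shipped migration may differ):

    # Sketch only: copies the old ai_helper_model setting into the helper
    # personas' default_llm_id, assuming values were stored as "custom:<id>".
    class CopyAiHelperModelToPersonas < ActiveRecord::Migration[7.1]
      def up
        row = execute("SELECT value FROM site_settings WHERE name = 'ai_helper_model'").first
        return if row.nil? || row["value"].blank?

        llm_id = row["value"].split(":").last.to_i
        return if llm_id <= 0

        execute(<<~SQL)
          UPDATE ai_personas
          SET default_llm_id = #{llm_id}
          WHERE id IN (-18, -19, -20, -21, -22, -23, -24, -25, -26)
        SQL
      end

      def down
        raise ActiveRecord::IrreversibleMigration
      end
    end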

Eventually, we will follow up in a separate commit to remove the following settings:

SiteSetting.ai_helper_model
SiteSetting.ai_helper_image_caption_model
SiteSetting.ai_summarization_model
SiteSetting.ai_embeddings_semantic_search_hyde_model

By default, SiteSetting.ai_default_llm_model will have default: "" in its YAML file; however, this update also adds a migration which ensures that the most recently created LLM model is used to fill the setting.
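
A minimal sketch of that backfill, assuming the setting is seeded from the newest llm_models row with raw SQL (the data_type value and conflict handling here are assumptions, not the shipped migration):

    # Sketch only: seed ai_default_llm_model with the most recently created LLM.
    class BackfillAiDefaultLlmModel < ActiveRecord::Migration[7.1]
      def up
        last_llm = execute("SELECT id FROM llm_models ORDER BY created_at DESC LIMIT 1").first
        return if last_llm.nil?

        # data_type 1 (string) is an assumption about how this setting is stored.
        execute(<<~SQL)
          INSERT INTO site_settings (name, data_type, value, created_at, updated_at)
          SELECT 'ai_default_llm_model', 1, '#{last_llm["id"]}', NOW(), NOW()
          WHERE NOT EXISTS (SELECT 1 FROM site_settings WHERE name = 'ai_default_llm_model')
        SQL
      end

      def down
        raise ActiveRecord::IrreversibleMigration
      end
    end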

📸 Screenshots

(Two screenshots: 2025-07-09 at 15:27:57 and 15:27:44)

This update adds a setting to configure a default LLM model to be used for all features (unless overridden).
@keegangeorge keegangeorge marked this pull request as draft July 9, 2025 22:33
@keegangeorge keegangeorge marked this pull request as ready for review July 15, 2025 18:17
@romanrizzi (Member) left a comment

Should we do ai_translation_model too?

@keegangeorge keegangeorge requested a review from romanrizzi July 17, 2025 21:59
def self.enabled?
  SiteSetting.discourse_ai_enabled && SiteSetting.ai_translation_enabled &&
    SiteSetting.ai_translation_model.present? &&
    SiteSetting.ai_default_llm_model.present? &&
@romanrizzi (Member)

Need to check if the persona has an LLM set

@keegangeorge (Member, Author) Jul 18, 2025

which one 😅

  ai_translation_locale_detector_persona:
    default: "-27"
    type: enum
    enum: "DiscourseAi::Configuration::PersonaEnumerator"
    area: "ai-features/translation"
  ai_translation_post_raw_translator_persona:
    default: "-28"
    type: enum
    enum: "DiscourseAi::Configuration::PersonaEnumerator"
    area: "ai-features/translation"
  ai_translation_topic_title_translator_persona:
    default: "-29"
    type: enum
    enum: "DiscourseAi::Configuration::PersonaEnumerator"
    area: "ai-features/translation"
  ai_translation_short_text_translator_persona:
    default: "-30"
    type: enum
    enum: "DiscourseAi::Configuration::PersonaEnumerator"
    area: "ai-features/translation"

I suppose all of them must have a default_llm_id OR SiteSetting.ai_default_llm_model needs to be present?
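
A hedged sketch of what that combined check could look like, using the persona settings from the YAML above (the helper shape and names here are assumptions, not the final implementation):

    # Sketch only: translation is enabled when every translation persona has a
    # default_llm_id, or when a site-wide default LLM model is configured.
    def self.enabled?
      persona_settings = %i[
        ai_translation_locale_detector_persona
        ai_translation_post_raw_translator_persona
        ai_translation_topic_title_translator_persona
        ai_translation_short_text_translator_persona
      ]

      personas_have_llm =
        persona_settings.all? do |setting|
          persona_id = SiteSetting.public_send(setting).to_i
          AiPersona.find_by(id: persona_id)&.default_llm_id.present?
        end

      SiteSetting.discourse_ai_enabled && SiteSetting.ai_translation_enabled &&
        (personas_have_llm || SiteSetting.ai_default_llm_model.present?)
    end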

@keegangeorge keegangeorge requested a review from romanrizzi July 18, 2025 19:31
@davidtaylorhq (Member) commented Jul 22, 2025

Plugin moving to core. This will need to be re-opened against discourse/discourse. Sorry @keegangeorge

