This repository was archived by the owner on Jul 22, 2025. It is now read-only.

Conversation

@SamSaffron SamSaffron commented Oct 21, 2024

Also, properly set LLM when swapping persona and ensure PM
recipients are properly updated

New UI is:

[screenshot: new UI]

And:

[screenshot: second view of the new UI]

This means you can operate with 100% tethered personas and the LLM selector is not required.

Cleans up the UI so it is clearer.

@SamSaffron (Member, Author) commented:

will sort out system test tomorrow.

```ruby
module DiscourseAi
  module Configuration
    class LlmValidator
      def self.global_usage
```
A reviewer (Member) commented:

Maybe move this over to LlmEnumerator?

@SamSaffron (Member, Author) replied:

will move
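The suggested refactor can be sketched as follows. This is a hypothetical illustration only: the class names mirror the ones in the discussion, but the method body and the delegation shim are assumptions, not the actual implementation that landed.

```ruby
module DiscourseAi
  module Configuration
    class LlmEnumerator
      # Hypothetical home for the lookup after the move. The return
      # shape (a hash of feature => LLM usage info) is assumed here.
      def self.global_usage
        @global_usage ||= {}
      end
    end

    class LlmValidator
      # A thin delegation could keep existing callers working while
      # the logic lives in LlmEnumerator (illustrative, not the real code).
      def self.global_usage
        LlmEnumerator.global_usage
      end
    end
  end
end
```

Keeping a delegating method during a move like this lets call sites migrate gradually instead of in one sweep.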

```ruby
if scope && scope[:llm_usage]
  scope[:llm_usage]
else
  DiscourseAi::Configuration::LlmValidator.global_usage
end
```
@romanrizzi (Member) commented Oct 21, 2024:

Might be better to use {} here? Users of this serializer should provide the usage in the scope after all.

@SamSaffron (Member, Author) replied:

maybe, but then we would need to amend all the usages in the controller, which is a bigger change.
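The two options being weighed can be sketched side by side. This is an illustrative standalone snippet, not the serializer's real code: `LlmValidator` is stubbed with made-up data so the example runs, and the method names are hypothetical.

```ruby
module DiscourseAi
  module Configuration
    class LlmValidator
      # Stub standing in for the real global lookup, for illustration only.
      def self.global_usage
        { summarization: "configured-llm" }
      end
    end
  end
end

# Option A (current behavior): fall back to the global lookup when the
# serializer scope does not provide :llm_usage.
def llm_usage_with_fallback(scope)
  if scope && scope[:llm_usage]
    scope[:llm_usage]
  else
    DiscourseAi::Configuration::LlmValidator.global_usage
  end
end

# Option B (reviewer's suggestion): default to {}, so callers must pass
# the usage in via the scope; missing data surfaces as an empty hash.
def llm_usage_strict(scope)
  (scope && scope[:llm_usage]) || {}
end
```

Option B makes the serializer's contract stricter and more explicit, at the cost of touching every controller that currently relies on the fallback, which is the trade-off the thread settles on.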

@SamSaffron SamSaffron merged commit a1f859a into main Oct 22, 2024
5 checks passed
@SamSaffron SamSaffron deleted the bug-fixes-force-llm branch October 22, 2024 00:16
