
Conversation

@keegangeorge (Member) commented:
This PR fixes an issue where the LLM enumerator would error out when `SiteSetting.ai_spam_detection = true` but there was no `AiModerationSetting.spam` present.

Typically, we add an `LlmDependencyValidator` for the setting itself; however, since spam is unique in that its model is set in `AiModerationSetting` rather than a `SiteSetting`, we add a simple check here to prevent erroring out (a sketch follows below).

This PR also adds a missing test for the `LlmEnumerator`.
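The description only mentions "a simple check", not the exact code. A minimal sketch of what such a guard could look like inside the enumerator is shown here; apart from `SiteSetting.ai_spam_detection` and `AiModerationSetting.spam` (which appear in the description), the method and attribute names (`global_usage`, `llm_model_id`) are assumptions for illustration, not the verbatim plugin code.

```ruby
# Hypothetical sketch, not the exact change from this PR: guard the spam
# branch so the enumerator skips it when no AiModerationSetting.spam exists.
module DiscourseAi
  module Configuration
    class LlmEnumerator
      def self.global_usage
        usage = Hash.new { |h, k| h[k] = [] }

        if SiteSetting.ai_spam_detection
          spam_setting = AiModerationSetting.spam
          # The site setting can be true while no AiModerationSetting.spam
          # record has been created yet; only read the model if it exists.
          if spam_setting&.llm_model_id
            usage[spam_setting.llm_model_id] << { type: :ai_spam }
          end
        end

        usage
      end
    end
  end
end
```

The accompanying spec would presumably exercise exactly this case: enable `ai_spam_detection` without creating an `AiModerationSetting.spam` record and assert that enumerating LLMs does not raise.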
@SamSaffron SamSaffron merged commit b480f13 into main Dec 26, 2024
6 checks passed
@SamSaffron SamSaffron deleted the fix-error-spam branch December 26, 2024 22:12