
@AbdullahKIRMAN (Collaborator)

This pull request improves the TransformersModerator integration, specifically the model loading process, to better handle dynamic imports and dependencies. The main change ensures that if PyTorch (torch) is loaded at runtime and was not present beforehand, the transformers module is reloaded to avoid issues with framework detection.

Dependency and dynamic import handling:

  • In src/moderators/integrations/transformers_moderator.py, added logic to check whether torch was loaded during runtime and, if so, reload the transformers module to ensure proper framework detection and compatibility (see the sketch after this list).
  • Imported importlib and sys to support dynamic reloading and module inspection for dependency management.
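A minimal sketch of how such a check can be structured, assuming a hypothetical load_model helper and pipeline task; it is not the actual code in transformers_moderator.py:

```python
import importlib
import sys


def load_model(model_name: str):
    """Illustrative model-loading helper (name and task are hypothetical)."""
    # Record whether torch was already imported before this call.
    torch_was_present = "torch" in sys.modules

    # Importing torch here may be the first time it enters sys.modules.
    import torch  # noqa: F401
    import transformers

    # If transformers was imported before torch became available, its
    # framework detection may be stale; reload it so detection runs again
    # with torch present.
    if not torch_was_present:
        transformers = importlib.reload(transformers)

    return transformers.pipeline("text-classification", model=model_name)
```

The reload matters because, per the PR description, transformers performs its framework detection when it is imported, so a torch import that happens only afterwards would otherwise go unnoticed.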

@fcakyon merged commit dda4162 into main on Oct 10, 2025
5 checks passed
@fcakyon deleted the torch-bug-fix branch on October 10, 2025 at 01:02