
@Maharnab-Saikia Maharnab-Saikia commented Jan 7, 2025

What does this PR do?

This PR adds an option to disable the NSFW safety_checker in the LoRA training evaluation script. This feature is implemented via a new parser argument --disable_safety_checker. When enabled, the safety_checker is set to None, preventing black images caused by false-positive NSFW detections during evaluation.
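The mechanism described above can be sketched roughly as follows. This is an illustrative reconstruction, not the PR's actual diff: the helper names (`build_parser`, `pipeline_kwargs`) are made up here, and only the flag name `--disable_safety_checker` and the `safety_checker=None` behavior come from the description.

```python
import argparse

def build_parser():
    # Hypothetical slice of the evaluation script's argument parser.
    parser = argparse.ArgumentParser(description="LoRA evaluation (sketch)")
    parser.add_argument(
        "--disable_safety_checker",
        action="store_true",
        help="Set the pipeline's safety_checker to None to avoid black "
             "images from false-positive NSFW detections during evaluation.",
    )
    return parser

def pipeline_kwargs(args):
    # When the flag is set, pass safety_checker=None so the pipeline
    # skips NSFW filtering instead of blacking out flagged images.
    kwargs = {}
    if args.disable_safety_checker:
        kwargs["safety_checker"] = None
    return kwargs

# Usage (assuming a Stable Diffusion pipeline, not shown in the PR text):
#   pipe = StableDiffusionPipeline.from_pretrained(model_id, **pipeline_kwargs(args))
```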

Motivation and Context

During evaluation, the safety_checker sometimes incorrectly flags generated images as NSFW and replaces them with black outputs even when the images are safe. This change lets users disable the safety_checker as needed, improving flexibility and usability of the evaluation script.

Dependencies

No new dependencies are introduced with this change.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@Maharnab-Saikia Maharnab-Saikia deleted the fix/black-images-safety-checker branch January 11, 2025 12:12