Show visual indication when score is below --fail-under threshold #10856

Open

danielalanbates wants to merge 2 commits into pylint-dev:main from danielalanbates:fix/issue-8503

Conversation

@danielalanbates

Summary

  • When the score is below the --fail-under threshold, the evaluation output now includes a visual message: Your score X.XX is below the fail-under threshold of Y.Y
  • Previously, pylint would exit with code 16 but give no visual indication in the terminal, making it unclear why the command failed
  • Added a test verifying the message appears when below threshold and does not appear when at or above threshold

Before:

------------------------------------
Your code has been rated at 7.19/10

After:

--------------------------------------------------------------------------------------------
Your code has been rated at 7.19/10
Your score 7.19 is below the fail-under threshold of 10.0

Closes #8503
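The behaviour described above can be sketched as a small helper (illustrative only — the real change lives in pylint/lint/pylinter.py and uses pylint's own names):

```python
def rating_message(score: float, fail_under: float) -> str:
    """Build the terminal rating line, appending the new hint when the
    score is strictly below the --fail-under threshold (sketch only)."""
    msg = f"Your code has been rated at {score:.2f}/10"
    if score < fail_under:
        msg += (
            f"\nYour score {score:.2f} is below the"
            f" fail-under threshold of {fail_under}"
        )
    return msg
```

With a score of 7.19 and the default threshold of 10.0 this yields the two-line "After" output shown above, while a score at or above the threshold keeps the single "Before" line.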

Test plan

  • Existing test_fail_under tests pass (exit codes unchanged)
  • Existing test_fail_on parametrized tests pass (20 cases)
  • New test_fail_under_visual_indication test verifies the message is shown when score < threshold
  • New test verifies the message is NOT shown when score >= threshold
  • Manual testing confirms the visual output

This PR was created with the assistance of Claude Opus 4.6 by Anthropic. Happy to make any adjustments! Reviewed and submitted by a human.

When the score is below the --fail-under threshold, the evaluation
output now includes a message indicating that the score is below
the configured threshold. Previously, pylint would exit with code 16
but give no visual indication in the terminal output.

Closes pylint-dev#8503

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@Pierre-Sassoulas Pierre-Sassoulas left a comment


Thank you for working on pylint. As with all LLM-generated MRs, this seems sensible, but we need to check it in detail to make sure it's the right place to do this. In this case there's probably another place where the under-threshold condition is checked that I feel could work better, but maybe the reporting is not intertwined with the exit logic, even if everything happens in pylinter. I'm on mobile, so I'll check later.

@github-actions

This comment has been minimized.

@Pierre-Sassoulas Pierre-Sassoulas added the Enhancement ✨ Improvement to a component label Feb 20, 2026
@Pierre-Sassoulas Pierre-Sassoulas added this to the 4.1.0 milestone Feb 20, 2026

@jacobtylerwalls jacobtylerwalls left a comment


Once the tests are passing, this LGTM

@danielalanbates
Author

Thank you for the review! Glad it helps. 🙏

@jacobtylerwalls
Member

👋 Do you plan to update the failing tests?

The test was using the default fail-under threshold of 10. With the new
visual-indication feature (a score below the threshold now emits a
message), the existing test's expected output broke because 0.00 < 10.

Set fail-under to -10 in the test, since it exercises multi-format
output rather than the fail-under threshold feature.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
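The fix works because the message is gated on a strict comparison: a 0.00-rated fixture sits below the default threshold of 10 but not below -10. A minimal illustration (not pylint code):

```python
def triggers_fail_under_message(score: float, fail_under: float) -> bool:
    # The new message appears only when the score is strictly below
    # the configured threshold.
    return score < fail_under

# Under the default threshold, the 0.00-rated test fixture now emits
# the message; with fail-under lowered to -10, it stays silent.
```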

@Pierre-Sassoulas Pierre-Sassoulas left a comment


Sorry for the delay in reviewing, thank you for working on pylint again.

Comment on lines +1214 to +1219
if note is not None and note < self.config.fail_under:
    msg += (
        f"\nYour score {note:.2f} is below the"
        f" fail-under threshold of {self.config.fail_under}"
    )


Imo this should go in run.py:

pylint/pylint/lint/run.py

Lines 248 to 253 in 4f62b34

if score_value >= linter.config.fail_under:
    sys.exit(0)
else:
    # We need to make sure we return a failing exit code in this case.
    # So we use self.linter.msg_status if that is non-zero, otherwise we just return 1.
    sys.exit(self.linter.msg_status or 1)

There's already a check done here, let's keep this logic in one place.
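Relocating the message as suggested might look roughly like this. A sketch only: `score_value` and the `linter` attributes are taken from the run.py snippet above, and the `print` stands in for whatever reporter call pylint would actually use:

```python
import sys

def exit_for_score(score_value: float, linter) -> None:
    """Sketch: keep the threshold check and the user-facing message
    together in run.py's exit logic."""
    if score_value >= linter.config.fail_under:
        sys.exit(0)
    # Surface why the run is failing before picking the exit code.
    print(
        f"Your score {score_value:.2f} is below the"
        f" fail-under threshold of {linter.config.fail_under}"
    )
    # Failing exit code: msg_status if non-zero, otherwise 1.
    sys.exit(linter.msg_status or 1)
```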

@danielalanbates
Author

Thank you for the review! Glad it helps. 🙏

@codecov

codecov bot commented Mar 9, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 96.04%. Comparing base (d2dc5df) to head (1f48a1b).
⚠️ Report is 38 commits behind head on main.

Additional details and impacted files


@@           Coverage Diff           @@
##             main   #10856   +/-   ##
=======================================
  Coverage   96.03%   96.04%           
=======================================
  Files         177      177           
  Lines       19621    19630    +9     
=======================================
+ Hits        18844    18853    +9     
  Misses        777      777           
Files with missing lines | Coverage Δ
pylint/lint/pylinter.py | 96.32% <100.00%> (+0.01%) ⬆️

... and 10 files with indirect coverage changes


@github-actions
Contributor

github-actions bot commented Mar 9, 2026

🤖 According to the primer, this change has no effect on the checked open source code. 🤖🎉

This comment was generated for commit 1f48a1b


Labels

Enhancement ✨ Improvement to a component

Projects

None yet

Development

Successfully merging this pull request may close these issues.

--fail-under has no visual impact on terminal output

3 participants