Conversation

@KAVYANSHTYAGI KAVYANSHTYAGI commented Sep 7, 2025

Summary

  • Compute the LR finder gradient over log-spaced learning rates when running in exponential mode
  • Update tests to cover the log-spacing behavior
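The idea behind the change can be sketched as follows: in exponential mode the sampled learning rates are log-spaced, so differentiating the loss curve against the raw sample index (or the raw LR values) distorts the slope; passing log-spaced x-coordinates to the gradient makes the steepest-descent suggestion consistent. This is a minimal illustration, not the actual `lr_finder.py` implementation; the helper name `lr_gradient` and the toy loss curve are made up for the example.

```python
import numpy as np

def lr_gradient(losses, lrs, mode="exponential"):
    """Gradient of the loss curve; uses a log-spaced x-axis in exponential mode.

    Hypothetical helper illustrating the log-spacing fix, not Lightning's API.
    """
    losses = np.asarray(losses, dtype=float)
    lrs = np.asarray(lrs, dtype=float)
    if mode == "exponential":
        # LRs were sampled on a log scale, so differentiate with respect
        # to log10(lr) instead of the raw learning-rate values.
        return np.gradient(losses, np.log10(lrs))
    # Linear mode: LRs are evenly spaced, differentiate against them directly.
    return np.gradient(losses, lrs)

# Toy example: 50 exponentially spaced LRs from 1e-5 to 1e-1,
# with a synthetic loss curve that is quadratic in log10(lr).
lrs = np.logspace(-5, -1, 50)
losses = np.log10(lrs) ** 2
grad = lr_gradient(losses, lrs)
# The usual suggestion heuristic: LR at the steepest descent of the curve.
suggestion = lrs[np.argmin(grad)]
```

With the quadratic toy curve the gradient is roughly `2 * log10(lr)`, so the steepest (most negative) slope sits at the smallest LR; differentiating against the raw index instead would compress the low-LR region and shift that point.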

Testing

  • pre-commit run --files src/lightning/pytorch/tuner/lr_finder.py tests/tests_pytorch/tuner/test_lr_finder.py
  • PYTHONPATH=src pytest tests/tests_pytorch/tuner/test_lr_finder.py::test_exponential_vs_linear_mode_gradient_difference -q
  • PYTHONPATH=src pytest tests/tests_pytorch/tuner/test_lr_finder.py::test_gradient_correctness -q

https://chatgpt.com/codex/tasks/task_e_68bc780c8a0c83338a4b570d05e155a8


📚 Documentation preview 📚: https://pytorch-lightning--5.org.readthedocs.build/en/5/
