v0.7.0 #385
ValerianRey started this conversation in General
⚡ Performance update ⚡

In this release, we updated torchjd to remove some of the unnecessary overhead in the internal code. This should lead to small but noticeable performance improvements (up to 10% speed).

We have also made torchjd more lightweight, by making optional some dependencies that were only used by CAGrad and NashMTL (the changelog explains how to keep installing these dependencies).

We have also fixed all internal type errors thanks to mypy, and we have added a py.typed file so mypy can be used downstream.

Changelog
Changed
- Changed the dependencies required by CAGrad and NashMTL to be optional when installing TorchJD. Users of these aggregators will have to use pip install torchjd[cagrad], pip install torchjd[nash_mtl] or pip install torchjd[full] to install TorchJD alongside those dependencies. This should make TorchJD more lightweight.
- Made the autojac package protected. The aggregators must now always be imported via their package (e.g. from torchjd.aggregation.upgrad import UPGrad must be changed to from torchjd.aggregation import UPGrad). The backward and mtl_backward functions must now always be imported directly from the torchjd package (e.g. from torchjd.autojac.mtl_backward import mtl_backward must be changed to from torchjd import mtl_backward). A sketch of the new import style is given after this list.
- Removed the check that matrices do not contain nan, inf or -inf values. This check was costly in memory and in time for large matrices, so this should improve performance. However, if the optimization diverges for some reason (for instance due to a learning rate that is too large), the resulting exceptions may come from other sources.
- Reduced some overhead in the autojac engine. This should lead to a small performance improvement.
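For reference, here is a minimal sketch of the new import style described above (the aggregator instantiation is only illustrative; the import paths themselves come from this changelog):

```python
# New (v0.7.0) import paths:
from torchjd import backward, mtl_backward
from torchjd.aggregation import UPGrad

# Old paths, which no longer work now that autojac is protected:
# from torchjd.aggregation.upgrad import UPGrad
# from torchjd.autojac.mtl_backward import mtl_backward

aggregator = UPGrad()  # illustrative; see the torchjd documentation for full usage
```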
Fixed
- Made the non-differentiable aggregators (CAGrad, ConFIG, DualProj, GradDrop, IMTLG, NashMTL, PCGrad and UPGrad) raise a NonDifferentiableError whenever one tries to differentiate through them. Before this change, trying to differentiate through them led to wrong gradients or unclear errors. A sketch of the new behavior is given below.
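As an illustration, here is a minimal sketch of that behavior (the toy Jacobian and the way the error is caught are assumptions made for the example; only UPGrad, its matrix-in call and the NonDifferentiableError name come from these notes):

```python
import torch
from torchjd.aggregation import UPGrad

aggregator = UPGrad()

# Toy Jacobian that requires grad, so a backward pass would have to
# differentiate through the aggregation.
jacobian = torch.tensor([[-4.0, 1.0, 1.0], [6.0, 1.0, 1.0]], requires_grad=True)
aggregated = aggregator(jacobian)

try:
    aggregated.sum().backward()  # tries to differentiate through the aggregator
except Exception as error:
    # As of v0.7.0 this should surface as a NonDifferentiableError, instead of
    # wrong gradients or an unclear error.
    print(type(error).__name__)
```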
Added
- Added a py.typed file in the top package of torchjd to ensure compliance with PEP 561. This should make it possible for users to run mypy against the type annotations provided in torchjd.
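For example, downstream type-checking might look like this sketch (the function and file name are hypothetical; only the availability of torchjd's type annotations via py.typed comes from these notes):

```python
# example_check.py: downstream code whose use of torchjd mypy can now check,
# because torchjd ships a py.typed marker (PEP 561).
import torch
from torchjd.aggregation import UPGrad


def aggregate(jacobian: torch.Tensor) -> torch.Tensor:
    aggregator = UPGrad()
    # mypy verifies this call against torchjd's own type annotations.
    return aggregator(jacobian)
```

A typical invocation would then be mypy example_check.py from the project root.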
This discussion was created from the release v0.7.0.