pytorch-optimizer v2.2.1

@kozistr kozistr released this 28 Jan 11:50
ce56167

Change Log

Feature

  • Support max_grad_norm (Adan optimizer)
  • Support gradient averaging (Lamb optimizer)
  • Support dampening and nesterov parameters (Lars optimizer)
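The max_grad_norm feature corresponds to global gradient-norm clipping: scale all gradients down together whenever their combined L2 norm exceeds a threshold. A conceptual sketch in plain Python (not the library's implementation; the function name is invented for illustration):

```python
import math

def clip_by_global_norm(grads, max_grad_norm):
    """Scale a list of gradient vectors so their combined L2 norm
    does not exceed max_grad_norm (no-op when already within bounds)."""
    global_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    if global_norm <= max_grad_norm or global_norm == 0.0:
        return grads
    scale = max_grad_norm / global_norm
    return [[g * scale for g in grad] for grad in grads]

grads = [[3.0, 4.0], [0.0, 12.0]]   # global norm = sqrt(9 + 16 + 144) = 13
clipped = clip_by_global_norm(grads, 1.0)  # scaled so global norm becomes 1
```

In the Adan optimizer this clipping would run once per step over all parameter gradients, before the moment updates.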

Refactor

  • Move the step parameter from per-parameter state to the group (to reduce computation cost and memory)
  • Load betas per parameter group instead of per parameter
  • Switch to in-place operations
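The state-to-group move means one shared step counter per parameter group rather than one per tensor, and in-place updates mutate existing buffers instead of allocating new ones. A toy sketch of the pattern in plain Python (hypothetical names, not the library's code):

```python
def init_group(params):
    # One shared step counter stored on the group, not one per parameter:
    # fewer state entries to track and less memory overall.
    return {"params": list(params), "step": 0, "lr": 0.1}

def sgd_step(group):
    # Toy plain-SGD update illustrating the pattern.
    group["step"] += 1
    for p in group["params"]:
        for i, g in enumerate(p["grad"]):
            # In-place update: mutate the existing buffer rather than
            # building a fresh one each step.
            p["value"][i] -= group["lr"] * g

param = {"value": [1.0, 2.0], "grad": [1.0, 1.0]}
group = init_group([param])
sgd_step(group)  # step counter advances once for the whole group
```

The same idea applies to betas: reading them from the group dict once per step avoids re-fetching them for every parameter tensor.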

Fix

  • Fix behavior when momentum is 0 (Lars optimizer)