Hi! I got it to work in the meantime. I added this to the main file where I call the CLI:

from typing import Any  # needed for the *args/**kwargs annotations

import transformers
from pytorch_lightning.utilities.cli import OPTIMIZER_REGISTRY

# Subclass transformers' Adafactor so the registry decorator can register it.
@OPTIMIZER_REGISTRY
class Adafactor(transformers.Adafactor):
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)

The main issue was in the config file: apparently one needs to write:

optimizer:
  class_path: __main__.Adafactor

instead of:

optimizer:
  class_path: Adafactor

Using the former, it works.
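The reason the module prefix is required can be sketched like this (a simplified illustration of the mechanism, not LightningCLI's actual code; `resolve_class_path` is a hypothetical helper): a `class_path` string is resolved by importing its module part and looking the class up by name, and a class defined in the script you launch lives in the `__main__` module, so a bare `Adafactor` has no module to import.

```python
import importlib

def resolve_class_path(class_path: str) -> type:
    # Split "package.module.ClassName" into module and class parts,
    # import the module, then fetch the class attribute from it.
    module_name, _, class_name = class_path.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# A bare "Adafactor" has no module part, so this lookup would fail;
# "__main__.Adafactor" imports the running script's module and finds it.
cls = resolve_class_path("collections.OrderedDict")
print(cls.__name__)  # → OrderedDict
```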

By the way, is there a way to register the optimizer in a separate file from the one that calls the CLI?
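(One layout I'd expect to work, sketched with a hypothetical `optimizers.py` module: put the registration there, and import that module in the CLI script so the `@OPTIMIZER_REGISTRY` decorator actually runs before the config is parsed.)

```python
# optimizers.py -- hypothetical separate module that owns the registration
from typing import Any

import transformers
from pytorch_lightning.utilities.cli import OPTIMIZER_REGISTRY

@OPTIMIZER_REGISTRY
class Adafactor(transformers.Adafactor):
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)

# main.py -- the file that calls the CLI
# import optimizers  # noqa: F401 -- importing runs the registry decorator
```

The config would then reference the module-qualified name instead of `__main__`:

optimizer:
  class_path: optimizers.Adafactor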

Answer selected by goncalomcorreia