evaluator init kwarg in train.py #38

@eggerdj

What should we add?

Description

The following trainers are initialized from one or more sub-trainers:

  • TransitionStatesTrainer
  • ReweightingTrainer
  • RecursiveTransitionStates
  • RecursionTrainer

These trainers therefore do not have an evaluator_init in the config from which they are initialized, since the evaluator is provided by the sub_trainers. This poses a problem in train.py, because we cannot pass initialization arguments for the evaluator. In particular, this block of code is an issue:

        if evaluator_init_kwargs_str is not None:
            if "evaluator" in conf["trainer_init"]:
                evaluator_cls = EVALUATORS[conf["trainer_init"]["evaluator"]]

                evaluator_init = conf["trainer_init"].get("evaluator_init", dict())
                evaluator_init.update(evaluator_cls.parse_init_kwargs(evaluator_init_kwargs_str))
                conf["trainer_init"]["evaluator_init"] = evaluator_init
            else:
                raise ValueError(
                    f"evaluator_init_kwargs{train_idx} given but no evaluator "
                    f"in trainer {trainer_name}."
                )

Proposed enhancement

To fix this, we could modify the from_config methods of the trainers listed above so that they optionally accept an evaluator in their config. If an evaluator is specified, then we override the evaluator of the sub_trainer.
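A minimal sketch of what such a from_config override could look like. All names here (WrapperTrainer, SubTrainer, the EVALUATORS registry contents, and the config keys) are illustrative assumptions standing in for the project's actual classes, not the real API:

```python
# Hypothetical registry mapping evaluator names to constructors;
# stands in for the project's EVALUATORS dict.
EVALUATORS = {"dummy": lambda **kwargs: ("dummy-evaluator", kwargs)}


class SubTrainer:
    """Stand-in for a sub-trainer that owns the evaluator."""

    def __init__(self, evaluator=None):
        self.evaluator = evaluator


class WrapperTrainer:
    """Stand-in for a wrapper trainer such as ReweightingTrainer."""

    def __init__(self, sub_trainer):
        self.sub_trainer = sub_trainer

    @classmethod
    def from_config(cls, conf):
        # Build the sub-trainer as before (details elided in this sketch).
        trainer = cls(SubTrainer())

        # Proposed enhancement: optionally accept an evaluator in the
        # wrapper's own config and override the sub-trainer's evaluator.
        if "evaluator" in conf:
            evaluator_cls = EVALUATORS[conf["evaluator"]]
            evaluator_init = conf.get("evaluator_init", dict())
            trainer.sub_trainer.evaluator = evaluator_cls(**evaluator_init)

        return trainer
```

With this shape, the block in train.py could treat wrapper trainers like any other trainer: it would place "evaluator" and "evaluator_init" at the top level of the wrapper's config, and from_config would push the resulting evaluator down into the sub_trainer.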
