Description
The following trainers are initialized from another trainer:
- TransitionStatesTrainer
- ReweightingTrainer
- RecursiveTransitionStates
- RecursionTrainer
These trainers therefore do not have an evaluator_init in the config they are initialized from, since the evaluator is provided by their sub_trainers. This poses a problem in train.py, because we cannot pass initialization arguments for the evaluator. In particular, this block of code is an issue:
```python
if evaluator_init_kwargs_str is not None:
    if "evaluator" in conf["trainer_init"]:
        evaluator_cls = EVALUATORS[conf["trainer_init"]["evaluator"]]
        evaluator_init = conf["trainer_init"].get("evaluator_init", dict())
        evaluator_init.update(evaluator_cls.parse_init_kwargs(evaluator_init_kwargs_str))
        conf["trainer_init"]["evaluator_init"] = evaluator_init
    else:
        raise ValueError(
            f"evaluator_init_kwargs{train_idx} given but no evaluator "
            f"in trainer {trainer_name}."
        )
```

Proposed enhancement
To fix this, we could modify the from_config methods of the trainers listed above so that they optionally accept an evaluator (and evaluator_init) in their config. If an evaluator is specified, it overrides the evaluator of the sub_trainer.
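A minimal sketch of what this override could look like. All names here (PlainTrainer, the registry, and the exact config keys) are simplified stand-ins assumed for illustration, not the repository's actual API:

```python
# Simplified stand-ins for the real trainer classes; names and config keys
# are assumptions for illustration only.

class PlainTrainer:
    """A leaf trainer that owns its evaluator."""

    def __init__(self, evaluator=None, evaluator_init=None):
        self.evaluator = evaluator
        self.evaluator_init = evaluator_init or {}

    @classmethod
    def from_config(cls, conf, registry):
        return cls(**conf["trainer_init"])


class ReweightingTrainer:
    """A wrapper trainer whose evaluator lives on its sub_trainer."""

    def __init__(self, sub_trainer):
        self.sub_trainer = sub_trainer

    @classmethod
    def from_config(cls, conf, registry):
        init = dict(conf["trainer_init"])
        # Optionally accept an evaluator (and evaluator_init) in the wrapper's
        # own config and push it down onto the sub_trainer's config, overriding
        # whatever the sub_trainer specified itself.
        evaluator = init.pop("evaluator", None)
        evaluator_init = init.pop("evaluator_init", None)
        sub_conf = init.pop("sub_trainer")
        if evaluator is not None:
            sub_conf["trainer_init"]["evaluator"] = evaluator
        if evaluator_init is not None:
            sub_conf["trainer_init"]["evaluator_init"] = evaluator_init
        sub_cls = registry[sub_conf["trainer"]]
        return cls(sub_trainer=sub_cls.from_config(sub_conf, registry), **init)


registry = {"PlainTrainer": PlainTrainer, "ReweightingTrainer": ReweightingTrainer}
conf = {
    "trainer": "ReweightingTrainer",
    "trainer_init": {
        "evaluator": "F1Evaluator",  # overrides the sub_trainer's evaluator
        "sub_trainer": {
            "trainer": "PlainTrainer",
            "trainer_init": {"evaluator": "AccuracyEvaluator"},
        },
    },
}
trainer = ReweightingTrainer.from_config(conf, registry)
print(trainer.sub_trainer.evaluator)  # → F1Evaluator
```

With something like this, the quoted train.py block would work unchanged for the wrapper trainers, since "evaluator" can now appear in their own trainer_init.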