This patch adds a `num_workers` flag to `es_trainer`. This matches the
behavior of the `train_locally` script, which matters for some internal
distributed-training scripts. Deriving the worker count from the number
of perturbations also ignores the underlying hardware entirely, even
though the hardware is what should determine the worker count. On top
of that, the existing logic did not account for antithetic sampling
doubling the number of models evaluated per iteration.
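
For illustration, here is a minimal sketch of how such a flag might interact with the workload size. Only the `num_workers` flag name comes from this patch; the helper `resolve_num_workers`, the `antithetic` parameter, and the fallback-to-CPU-count behavior are assumptions for the example, not the actual implementation.

```python
import multiprocessing

from absl import flags

# Hypothetical flag definition; the real default and help text may differ.
flags.DEFINE_integer(
    "num_workers", None,
    "Number of evaluation workers. If unset, falls back to the machine's "
    "CPU count instead of being derived from the perturbation count.")

FLAGS = flags.FLAGS


def resolve_num_workers(num_perturbations: int, antithetic: bool = True) -> int:
    """Picks a worker count from the flag, capped by the actual workload."""
    # With antithetic sampling, each perturbation produces two models to
    # evaluate per iteration (theta + eps and theta - eps).
    evals_per_iter = num_perturbations * (2 if antithetic else 1)
    # Prefer the explicit flag; otherwise let the hardware decide.
    workers = FLAGS.num_workers or multiprocessing.cpu_count()
    # Spawning more workers than there are evaluations buys nothing.
    return min(workers, evals_per_iter)
```

The cap at `evals_per_iter` keeps the flag safe to set generously, while the CPU-count fallback reflects the point above: absent an explicit choice, the hardware, not the perturbation count, should drive parallelism.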