
Training loss logs #393

@FJonske

Description

Hi,

Do you happen to have any of your training/validation losses logged somewhere, and is there any chance I could have that log?

I'd like to get a rough feeling for whether the hyperparameter changes I make have a positive or negative impact on convergence speed, without doing a full training run. The reasoning is that I have a more modern DGX available and can potentially train with more data, and that I intend to make up some speed by increasing the batch size and learning rate. I'd just like to know whether those runs look roughly similar to yours in terms of training behavior/loss decrease per computation step.
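
For concreteness, the kind of adjustment I have in mind is the usual linear scaling rule for the learning rate. Here is a minimal sketch of my own (not code from this repo; the reference batch size and learning rate are placeholders, not the values behind your published runs):

```python
# Minimal sketch (my own illustration, not from this repo) of the linear
# scaling rule: if the batch size grows by a factor k, scale the learning
# rate by the same factor. The reference values below are placeholders.

def scaled_lr(base_lr: float, base_batch_size: int, new_batch_size: int) -> float:
    """Return the learning rate scaled linearly with the batch size."""
    return base_lr * new_batch_size / base_batch_size

# e.g. going from an assumed base batch size of 2 at lr=1e-4 to batch size 8:
print(scaled_lr(1e-4, 2, 8))  # -> 4e-4
```

Having your loss curves as a reference would let me check whether runs adjusted this way still track yours per step.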

Best regards,
Frederic
