
Dropout in LeakyParallel #403

@LuposX

Does the dropout parameter in LeakyParallel actually do anything? In my experiments, the model still learns even when the parameter is set to 1.

The PyTorch RNN documentation, on which LeakyParallel is based, says:

dropout – If non-zero, introduces a Dropout layer on the outputs of each RNN layer except the last layer, with dropout probability equal to dropout. Default: 0
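
That inter-layer behaviour is easy to observe with torch.nn.RNN directly. Here is a minimal sketch I used to convince myself (the sizes are arbitrary and unrelated to LeakyParallel):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(5, 3, 8)  # (time steps, batch, features)

# With two stacked layers, dropout is applied to the first layer's
# output before it feeds the second layer, so repeated forward passes
# in train mode differ because the dropout mask is resampled.
rnn = nn.RNN(input_size=8, hidden_size=4, num_layers=2, dropout=0.5)
rnn.train()
out1, _ = rnn(x)
out2, _ = rnn(x)
print(torch.equal(out1, out2))  # False (almost surely)

rnn.eval()  # dropout is disabled at inference time
out3, _ = rnn(x)
out4, _ = rnn(x)
print(torch.equal(out3, out4))  # True
```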

Is the dropout parameter of LeakyParallel ineffective because the module consists of only a single RNN layer, which is therefore also the last layer, so the dropout is never applied?
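
To test this hypothesis I compared two otherwise-identical single-layer nn.RNN modules in train mode. This is only a sketch: I'm assuming LeakyParallel forwards its dropout argument to a single-layer nn.RNN internally, which I haven't verified in the source.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(5, 3, 8)  # (time steps, batch, features)

# Two single-layer RNNs that differ only in the dropout setting.
# Constructing the second one makes PyTorch emit a UserWarning that
# non-zero dropout "expects num_layers greater than 1".
rnn_plain = nn.RNN(input_size=8, hidden_size=4, num_layers=1, dropout=0.0)
rnn_drop = nn.RNN(input_size=8, hidden_size=4, num_layers=1, dropout=1.0)

# Copy the weights so the two modules are otherwise identical.
rnn_drop.load_state_dict(rnn_plain.state_dict())

rnn_plain.train()
rnn_drop.train()

out_plain, _ = rnn_plain(x)
out_drop, _ = rnn_drop(x)
print(torch.equal(out_plain, out_drop))  # True: dropout=1 changed nothing
```

Notably, PyTorch itself emits a UserWarning when a single-layer RNN is constructed with non-zero dropout ("dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1"), which would fit the single-layer explanation.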
