Position-wise feedforward has only one linear layer? #16

@yuanenming

Description

```python
def positionwise_ffn(self, inp, activation_type='relu'):
```

In your implementation, the FFN module has only one linear layer. Is this a bug?
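
For reference, the position-wise FFN in "Attention Is All You Need" (Vaswani et al., 2017) applies two linear transformations with an activation in between, FFN(x) = max(0, xW1 + b1)W2 + b2, to each position independently. Below is a minimal PyTorch sketch of that standard two-layer form; the class name, the `d_ff` expansion size, and the activation handling are illustrative assumptions, not taken from this repository's code:

```python
import torch
import torch.nn as nn

class PositionwiseFFN(nn.Module):
    """Standard two-layer position-wise feed-forward network (sketch).

    Applies the same MLP to every position independently:
    FFN(x) = W2 @ act(W1 @ x + b1) + b2
    """

    def __init__(self, d_model: int, d_ff: int, activation_type: str = 'relu'):
        super().__init__()
        # First layer expands d_model -> d_ff, second projects back d_ff -> d_model.
        # d_ff = 4 * d_model is the common choice in the original paper (assumption here).
        self.linear1 = nn.Linear(d_model, d_ff)
        self.linear2 = nn.Linear(d_ff, d_model)
        self.act = nn.ReLU() if activation_type == 'relu' else nn.GELU()

    def forward(self, inp: torch.Tensor) -> torch.Tensor:
        # inp: (batch, seq_len, d_model). Both linear layers act on the last
        # dimension only, so every position is transformed identically.
        return self.linear2(self.act(self.linear1(inp)))
```

With only a single linear layer (and no second projection), the module would reduce to a per-position affine map plus activation, which loses the expand-then-project structure the original architecture describes.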
