
A question about nn.Embedding #30

@zhang-qiang-github

Description


Thank you for sharing this project's code. I have a question about nn.Embedding.

In this project, the shape of src and trg is (maxLen, batch size). The Encoder's forward method is:

    def forward(self, src, hidden=None):
        # src: (maxLen, batch), time-major, the default layout for nn.GRU
        # when batch_first=False
        embedded = self.embed(src)                    # (maxLen, batch, embed_dim)
        outputs, hidden = self.gru(embedded, hidden)  # (maxLen, batch, 2 * hidden_size)
        # sum the forward and backward halves of the bidirectional outputs
        outputs = (outputs[:, :, :self.hidden_size] +
                   outputs[:, :, self.hidden_size:])  # (maxLen, batch, hidden_size)
        return outputs, hidden
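
For concreteness, here is a minimal standalone sketch of the same shape flow. The vocabulary size and dimensions below are hypothetical, chosen only to mirror the (37, 32) case; they are not taken from the project:

    import torch
    import torch.nn as nn

    # Hypothetical sizes, for illustration only
    vocab_size, embed_dim, hidden_size = 100, 16, 8
    max_len, batch_size = 37, 32

    embed = nn.Embedding(vocab_size, embed_dim)
    gru = nn.GRU(embed_dim, hidden_size, bidirectional=True)  # batch_first=False by default

    src = torch.randint(0, vocab_size, (max_len, batch_size))  # time-major: (maxLen, batch)
    outputs, hidden = gru(embed(src))                          # outputs: (37, 32, 2 * 8)
    outputs = outputs[:, :, :hidden_size] + outputs[:, :, hidden_size:]
    print(outputs.shape)  # torch.Size([37, 32, 8])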

When I debug it, the shape of src is (37, 32), where 32 is the batch size.
However, the example in the nn.Embedding documentation shows:

>>> # an Embedding module containing 10 tensors of size 3
>>> embedding = nn.Embedding(10, 3)
>>> # a batch of 2 samples of 4 indices each
>>> input = torch.LongTensor([[1,2,4,5],[4,3,2,9]])
>>> embedding(input)

Thus, it looks like the input to Embedding should be (batch size, maxLen).
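
For comparison, nn.Embedding itself is just a per-index table lookup: it accepts an integer tensor of any shape and returns that shape with an embedding_dim axis appended, so a (maxLen, batch) layout is handled the same way as (batch, maxLen); the layout convention comes from the downstream GRU (its batch_first flag), not from the embedding. A quick shape check (the sizes here are hypothetical, taken from the docs example above):

    import torch
    import torch.nn as nn

    embedding = nn.Embedding(10, 3)  # hypothetical: 10-word vocab, 3-dim embeddings

    batch_major = torch.LongTensor([[1, 2, 4, 5], [4, 3, 2, 9]])  # (batch=2, maxLen=4)
    time_major = batch_major.t().contiguous()                     # (maxLen=4, batch=2)

    print(embedding(batch_major).shape)  # torch.Size([2, 4, 3])
    print(embedding(time_major).shape)   # torch.Size([4, 2, 3])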

This discrepancy confuses me.

Any suggestion is appreciated!
