
Question about the code in seq2seq attention #2

@wmathor

Description


Regarding the following line of code:
embedded = self.embedding(x).view(self.sentence_length, 1, -1) # seq_len * batch_size * word_size
Isn't the comment wrong? I think the resulting shape should be seq_len * 1 * word_size.
Also, with view applied like this, can the model still be trained on samples with batch_size > 1?
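To illustrate what I mean, here is a minimal sketch (assuming a standard nn.Embedding layer; the sizes and names below are placeholders, not taken from the repo). With batch_size = 1 the view gives seq_len * 1 * word_size, but with a larger batch the same view no longer reflects the batch structure:

```python
import torch
import torch.nn as nn

# Hypothetical setup mirroring the code in question:
# vocab of 10 tokens, 8-dim embeddings, sentences of length 5.
sentence_length, embed_dim = 5, 8
embedding = nn.Embedding(10, embed_dim)

# batch_size = 1: input of shape (seq_len,)
x1 = torch.randint(0, 10, (sentence_length,))
e1 = embedding(x1)                                  # (5, 8)
print(e1.view(sentence_length, 1, -1).shape)        # torch.Size([5, 1, 8]) -> seq_len * 1 * embed_dim

# batch_size = 3: embedding output is (batch, seq_len, embed_dim)
x3 = torch.randint(0, 10, (3, sentence_length))
e3 = embedding(x3)                                  # (3, 5, 8)
# view(sentence_length, 1, -1) on 3*5*8 = 120 elements yields (5, 1, 24),
# silently mixing embeddings from different samples in the batch.
print(e3.view(sentence_length, 1, -1).shape)        # torch.Size([5, 1, 24])
# A batch-safe way to get seq_len * batch_size * embed_dim would be a transpose:
print(e3.transpose(0, 1).shape)                     # torch.Size([5, 3, 8])
```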
