Hello, I've been working on seq2seq and noticed that here
https://github.com/yandexdataschool/Practical_RL/blob/master/week07_seq2seq/basic_model_torch.py#L36
we take the hidden state of the first token *after* EOS. Two lines earlier, `infer_length` is called with `include_eos=True`, so the returned length points one position past the EOS token.

If this is intentional, could you please explain the point of taking a hidden state that includes information from a token that isn't part of the sequence? Otherwise, I believe 1 should be subtracted from every element of `end_index`, not only from the elements that go out of range.
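To illustrate the off-by-one, here is a minimal sketch. The `infer_length` below is a hypothetical re-implementation of the idea in `basic_model_torch.py` (not the course's exact code), and the variable names `end_index_buggy` / `end_index_fixed` are mine, chosen only to contrast the current behavior with the proposed fix:

```python
import torch

def infer_length(seq, eos_ix, include_eos=True):
    """Length of each sequence up to the first EOS, optionally counting
    the EOS itself. Hypothetical sketch for illustration."""
    is_eos = (seq == eos_ix)
    count_eos = torch.cumsum(is_eos.to(torch.int64), dim=1)
    # number of positions strictly before the first EOS
    lengths = (count_eos == 0).to(torch.int64).sum(dim=1)
    if include_eos:
        lengths = lengths + 1
    return lengths

# toy batch with eos_ix = 0
seq = torch.tensor([[5, 3, 0, 9],
                    [7, 0, 2, 4]])
hidden = torch.randn(2, 4, 8)  # [batch, time, hid]

lengths = infer_length(seq, eos_ix=0, include_eos=True)  # [3, 2]

# Current behavior: only out-of-range indices are clamped, so in-range
# indices still point one step PAST the EOS token.
end_index_buggy = torch.where(lengths < seq.shape[1], lengths, lengths - 1)

# Proposed fix: subtract 1 everywhere, so we always index the hidden
# state AT the EOS token itself.
end_index_fixed = lengths - 1

h_at_eos = hidden[torch.arange(seq.shape[0]), end_index_fixed]  # [batch, hid]
```

With the toy batch above, `end_index_buggy` selects the tokens `9` and `2` (which come after EOS), while `end_index_fixed` selects the EOS positions themselves, which is what I'd expect the final hidden state to correspond to.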