fix: correct the positional encoding of Transformer in pytorch examples #20203
Conversation
The difference between this example and the original PyTorch example is the default value of an argument. In order to use the positional encoding correctly, I made the changes above, but I don't know how to write a test case.
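For reference, here is a minimal sketch of a sinusoidal positional encoding that matches batch-first inputs, in the spirit of the fix described above. This is not the actual diff from this PR: the class name, shapes, and the sanity check at the bottom are illustrative assumptions, and the check is only a possible starting point for a test case.

```python
import math
import torch
from torch import nn


class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding for batch-first inputs of shape (batch, seq_len, dim).

    Illustrative sketch; not the code from this PR.
    """

    def __init__(self, dim: int, max_len: int = 5000) -> None:
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, dim, 2) * (-math.log(10000.0) / dim))
        # Leading batch dimension of size 1 so the buffer broadcasts over the batch.
        pe = torch.zeros(1, max_len, dim)
        pe[0, :, 0::2] = torch.sin(position * div_term)
        pe[0, :, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Slice on dim 1 (the sequence dimension): this is the key detail when
        # the model consumes (batch, seq_len, dim) tensors rather than
        # sequence-first (seq_len, batch, dim) ones.
        return x + self.pe[:, : x.size(1)]


# Minimal shape-based sanity check, usable as a seed for a test case:
if __name__ == "__main__":
    batch, seq_len, dim = 4, 32, 64
    enc = PositionalEncoding(dim)
    out = enc(torch.zeros(batch, seq_len, dim))
    assert out.shape == (batch, seq_len, dim)
    # Every sample in the batch must receive the same positional offsets.
    assert torch.equal(out[0], out[1])
```

A test along these lines would catch the original bug: if the encoding buffer were sliced on the wrong dimension, either the output shape would change or the broadcast would add different offsets to different batch elements.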
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

```
@@            Coverage Diff            @@
##           master   #20203     +/-   ##
=========================================
- Coverage      89%      81%      -8%
=========================================
  Files         267      264       -3
  Lines       23084    23029      -55
=========================================
- Hits        20585    18618    -1967
- Misses       2499     4411    +1912
```
@Borda Hi, could you tell me how to deal with the two failed workflows?
Nice catch!
What does this PR do?
Fixes #19138
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet list:
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--20203.org.readthedocs.build/en/20203/