New Problem: Implement Rotary Positional Embeddings in self-attention #537

@mavleo96

Description

I propose adding a question on implementing a self-attention block with rotary positional embeddings (RoPE).
Link: RoFormer: Enhanced Transformer with Rotary Position Embedding

(Not sure if this question is already part of some PR)
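
For concreteness, here is a minimal sketch (PyTorch assumed) of what a solution might look like. The function names `rope_rotate` and `rope_self_attention`, the tensor shapes, and the `base = 10000.0` frequency constant are illustrative assumptions, not part of any existing problem or reference solution. The rotation itself follows the RoFormer formulation: each channel pair `(2i, 2i+1)` of the queries and keys is rotated by an angle `m * theta_i`, where `m` is the token position and `theta_i = base**(-2i/d)`.

```python
# Minimal illustrative sketch of self-attention with rotary positional
# embeddings (RoPE), following the RoFormer formulation. All names and
# shapes here are assumptions for illustration only.
import torch

def rope_rotate(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply RoPE to x of shape (..., seq_len, dim).

    Each channel pair (2i, 2i+1) is rotated by m * theta_i, where m is
    the token position and theta_i = base**(-2i/dim).
    """
    seq_len, dim = x.shape[-2], x.shape[-1]
    assert dim % 2 == 0, "head dimension must be even for RoPE"
    # Per-pair frequencies theta_i and per-position angles m * theta_i.
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=x.dtype) / dim)
    angles = torch.arange(seq_len, dtype=x.dtype)[:, None] * inv_freq[None, :]
    cos, sin = angles.cos(), angles.sin()        # each (seq_len, dim/2)
    x_even, x_odd = x[..., 0::2], x[..., 1::2]   # split channel pairs
    # 2-D rotation of each (even, odd) pair, then re-interleave channels.
    rotated = torch.stack(
        (x_even * cos - x_odd * sin, x_even * sin + x_odd * cos), dim=-1
    )
    return rotated.flatten(-2)

def rope_self_attention(q, k, v):
    """Scaled dot-product self-attention with RoPE applied to q and k.

    q, k, v: (batch, heads, seq_len, head_dim).
    """
    q, k = rope_rotate(q), rope_rotate(k)
    scale = q.shape[-1] ** -0.5
    attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)
    return attn @ v

# Smoke test with arbitrary shapes.
q = k = v = torch.randn(2, 4, 16, 32)  # (batch, heads, seq, head_dim)
out = rope_self_attention(q, k, v)
print(out.shape)                       # torch.Size([2, 4, 16, 32])
```

Because the rotation is applied only to the queries and keys, the attention logits depend on positions only through their relative offset, which is the core property RoPE provides.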
