
Block-wise positional embedding #12

@katop1234

Hi,

I was looking to implement the RIN architecture using your code. I noticed that you apply the positional embedding in every block, rather than just once at the beginning. What was the reasoning behind this?

https://github.com/lucidrains/recurrent-interface-network-pytorch/blob/main/rin_pytorch/rin_pytorch.py#L312
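To make the distinction concrete, here is a minimal sketch of the two placements. The class names (`ToyBlock`, `PosOncemodel`-style names) and block internals are simplified illustrations I made up, not the actual `rin_pytorch` modules:

```python
import torch
from torch import nn

class ToyBlock(nn.Module):
    """Stand-in for one attention block (hypothetical, for illustration)."""
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, x):
        h = self.norm(x)
        out, _ = self.attn(h, h, h)
        return x + out

class PosOnceModel(nn.Module):
    """Positional embedding added a single time, before the first block."""
    def __init__(self, dim, seq_len, depth):
        super().__init__()
        self.pos_emb = nn.Parameter(torch.randn(seq_len, dim) * 0.02)
        self.blocks = nn.ModuleList(ToyBlock(dim) for _ in range(depth))

    def forward(self, x):
        x = x + self.pos_emb
        for block in self.blocks:
            x = block(x)
        return x

class PosPerBlockModel(nn.Module):
    """Positional embedding re-added at the entry of every block,
    i.e. the pattern this issue asks about."""
    def __init__(self, dim, seq_len, depth):
        super().__init__()
        self.pos_emb = nn.Parameter(torch.randn(seq_len, dim) * 0.02)
        self.blocks = nn.ModuleList(ToyBlock(dim) for _ in range(depth))

    def forward(self, x):
        for block in self.blocks:
            x = block(x + self.pos_emb)
        return x

x = torch.randn(2, 16, 64)
print(PosOnceModel(64, 16, 4)(x).shape)      # torch.Size([2, 16, 64])
print(PosPerBlockModel(64, 16, 4)(x).shape)  # torch.Size([2, 16, 64])
```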

And thank you for writing this code!
