Implement Mixed-Curvature Transformers #10

@pchlenski

Description

Stub classes are needed for mixed-curvature attention. This would entail implementing all of the classes in the mapping below:

Reference code:
There is a pre-existing implementation of sectional curvature in the supplemental materials of the [Curve Your Attention](https://openreview.net/forum?id=AN5uo4ByWH) OpenReview submission. The relevant implementations are in `code/encoders/FPST/networks.py`. The class mappings are:

  - `StereographicLayerNorm` --> `StereographicLayerNorm`
  - `StereographicAttention` --> `StereographicAttentionLayer`
  - `StereographicAct` --> implemented in `Manifold.apply()`
  - `StereographicLinear` --> implemented in `KappaGCNLayer` with `A=None`
  - `StereographicTransformer` --> `StereographicTransformer`
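A minimal sketch of what the stubs above might look like, assuming a numpy-style interface (all class and method signatures here are hypothetical, not taken from the reference code). The one concrete piece of math included is Möbius (gyrovector) addition in the κ-stereographic model, which the real layers would build on:

```python
import numpy as np

def mobius_add(x, y, kappa):
    """Gyrovector addition in the kappa-stereographic model.

    kappa < 0 gives the Poincare ball, kappa > 0 the projected sphere,
    and kappa -> 0 recovers ordinary Euclidean addition.
    """
    xy = float(np.dot(x, y))
    x2 = float(np.dot(x, x))
    y2 = float(np.dot(y, y))
    num = (1 - 2 * kappa * xy - kappa * y2) * x + (1 + kappa * x2) * y
    den = 1 - 2 * kappa * xy + kappa ** 2 * x2 * y2
    return num / den

class StereographicLinear:
    """Stub: Mobius-linear map (cf. KappaGCNLayer with A=None)."""
    def __init__(self, in_dim, out_dim, kappa):
        self.kappa = kappa
        self.weight = np.random.randn(out_dim, in_dim) / np.sqrt(in_dim)

    def forward(self, x):
        # Would apply self.weight via log/exp maps at the origin.
        raise NotImplementedError

class StereographicLayerNorm:
    """Stub: layer norm applied in the tangent space at the origin."""
    def forward(self, x):
        raise NotImplementedError

class StereographicAttention:
    """Stub: attention with stereographic midpoint aggregation
    (cf. StereographicAttentionLayer)."""
    def forward(self, x):
        raise NotImplementedError

class StereographicTransformer:
    """Stub: full encoder stacking the blocks above."""
    def forward(self, x):
        raise NotImplementedError
```

As a sanity check on the arithmetic, `mobius_add` reduces to plain vector addition when `kappa == 0`, and with `kappa == -1` it keeps points of the Poincaré ball inside the unit ball.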

Paper:
See Cho et al. (2024), *Learning Mixed-Curvature Representations in Product Spaces*, for a description of the method.
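For orientation, the core idea of product-space representations is that a point lives in a product of constant-curvature components, and the product distance is the l2 combination of the per-component geodesic distances. A small self-contained illustration (helper names are mine, not from the paper or the reference code):

```python
import numpy as np

def sphere_dist(u, v, r=1.0):
    # Geodesic distance on a radius-r sphere (u, v assumed on the sphere).
    cos = np.clip(np.dot(u, v) / (r * r), -1.0, 1.0)
    return r * np.arccos(cos)

def poincare_dist(u, v):
    # Geodesic distance in the Poincare ball (curvature -1).
    diff = np.linalg.norm(u - v) ** 2
    denom = (1 - np.linalg.norm(u) ** 2) * (1 - np.linalg.norm(v) ** 2)
    return np.arccosh(1 + 2 * diff / denom)

def product_dist(component_dists):
    # Distance in a product manifold: l2 norm of the component distances.
    return np.sqrt(sum(d ** 2 for d in component_dists))
```

For example, combining a spherical component distance with a hyperbolic one is just `product_dist([sphere_dist(u1, v1), poincare_dist(u2, v2)])`; Euclidean components slot in the same way.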
