
Inquiry on Combining Hybrid O2M and Relation Attention Mask #38

@Anchor1566

Question

Thank you for your remarkable work and elegant implementation!

While reproducing and experimenting with your approach, I ran into a question about combining the hybrid O2M (one-to-many) mechanism with the proposed relation attention mask. When implemented separately, each component improves performance; when combined, however, performance noticeably declines. Adjusting the loss weight of the O2M component did not resolve the issue.
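For context, one subtlety I suspect (this is my own hypothetical sketch, not code from the repository): in hybrid-matching setups the one-to-many branch duplicates the query set into k groups, so a relation attention mask built for the original Q queries has to be both tiled to the duplicated size and combined with a group-isolation mask, otherwise duplicated queries can attend across O2M groups. The function name `expand_relation_mask` and the `k`-group layout below are assumptions for illustration:

```python
import numpy as np

def expand_relation_mask(rel_mask: np.ndarray, k: int) -> np.ndarray:
    """Tile a (Q, Q) boolean relation mask (True = attention allowed)
    for k one-to-many query groups, while forbidding any attention
    between different groups (block-diagonal structure)."""
    q = rel_mask.shape[0]
    # Repeat the relation pattern for every pair of groups...
    tiled = np.tile(rel_mask, (k, k))
    # ...then keep only the diagonal blocks (within-group attention).
    group_block = np.kron(np.eye(k, dtype=bool), np.ones((q, q), dtype=bool))
    return tiled & group_block

# Example: 2 queries, 2 O2M groups -> a (4, 4) mask where queries
# attend within their own group only.
rel = np.ones((2, 2), dtype=bool)
expanded = expand_relation_mask(rel, k=2)
```

If the mask is tiled without the group-isolation term (or not tiled at all), the two mechanisms interfere, which could plausibly explain the drop when they are combined.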

Could there be any subtle details or considerations that I might have overlooked when integrating these two components? Any insights or suggestions you could provide would be greatly appreciated.

Thank you in advance for your time and response. Also, I’d like to take this opportunity to wish you an early Happy Chinese New Year! May the coming year bring even more wonderful things to your life and research endeavors.

