Description
Question
Thank you for your remarkable work and elegant implementation!
While reproducing and experimenting with your approach, I ran into a question about combining the hybrid O2M (one-to-many) mechanism with the proposed relation attention mask. When applied separately, each component improves performance; when combined, however, the results show a noticeable decline. I have tried adjusting the weight of the O2M component, but the issue persists.
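For concreteness, here is a minimal sketch of how I currently merge the two masks in my experiments. The shapes, names, and the `combine_masks` helper are my own placeholders, not taken from your code, so please correct me if this differs from the intended integration:

```python
import torch

def combine_masks(relation_bias: torch.Tensor,
                  num_o2o: int,
                  num_o2m: int) -> torch.Tensor:
    """Merge the additive relation attention bias with the hybrid-branch
    isolation mask used for one-to-many training.

    relation_bias: [batch, heads, N, N] additive bias computed over all
                   N = num_o2o + num_o2m queries (placeholder shape).
    Returns an additive mask of the same shape: the relation bias is kept
    within each branch, and cross-branch entries are set to -inf so the
    one-to-one and one-to-many queries never attend to each other.
    """
    n = num_o2o + num_o2m
    assert relation_bias.shape[-1] == n
    # Branch id per query: 0 for the O2O queries, 1 for the O2M queries.
    branch = torch.zeros(n, dtype=torch.long, device=relation_bias.device)
    branch[num_o2o:] = 1
    # Block-diagonal keep-mask: True inside a branch, False across branches.
    same_branch = branch.unsqueeze(0) == branch.unsqueeze(1)  # [N, N]
    return relation_bias.masked_fill(~same_branch, float("-inf"))
```

The merged tensor is then passed as the additive `attn_mask` of the decoder self-attention. If the cross-branch entries or the relation bias on the O2M queries should be handled differently, that may be exactly the detail I am missing.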
Could there be any subtle details or considerations that I might have overlooked when integrating these two components? Any insights or suggestions you could provide would be greatly appreciated.
Thank you in advance for your time and response. Also, I’d like to take this opportunity to wish you an early Happy Chinese New Year! May the coming year bring even more wonderful things to your life and research endeavors.
Additional
No response