GPU usage of drop edge #4054
Answered by rusty1s
songyesog2000 asked this question in Q&A
Sorry for asking a dumb question here, since I am a newbie to programming. I am seeing slow GPU execution when using drop edge in each layer of my NN model (drop edge documentation here). I wonder whether this is caused by the CUDA compatibility of the function or by the generation of the random boolean values. Thank you in advance for checking this for me, and for maintaining such an amazing package!
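For context on what the question is doing: edge dropout randomly masks out a subset of edges on every forward pass during training. A minimal NumPy sketch of the idea (illustrative only; the function name and signature below are hypothetical and are not PyG's actual API):

```python
import numpy as np

def dropout_edge(edge_index: np.ndarray, p: float = 0.5,
                 training: bool = True, rng=None):
    """Randomly drop each edge with probability ``p``.

    ``edge_index`` has shape (2, E) in COO format: row 0 holds the
    source nodes, row 1 the target nodes. Returns the retained edges
    and the boolean keep-mask.
    """
    num_edges = edge_index.shape[1]
    if not training or p == 0.0:
        # No-op at evaluation time or when nothing is dropped.
        return edge_index, np.ones(num_edges, dtype=bool)
    rng = rng or np.random.default_rng()
    keep = rng.random(num_edges) >= p  # Bernoulli keep-mask per edge
    return edge_index[:, keep], keep
```

The mask is sampled fresh on every call, so each layer (or each forward pass) sees a different random subgraph, which is the regularization effect drop edge is after.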
Answered by rusty1s on Feb 11, 2022
Replies: 1 comment, 1 reply
1 reply
Answer selected by songyesog2000
Thanks for reporting. There were indeed some issues that slowed down GNN execution (in particular when force_undirected=True is set). I fixed this in #4059.