
It's not easy to give a definitive answer here. The difference in accuracy might come down to your data or the number of layers you are using. Here are some thoughts on your two questions.

  • ChebConv with degree K and a stack of K GCNConv layers aren't equivalent. Each GCNConv layer aggregates the node embeddings produced by the previous GCNConv layer (with a nonlinearity in between), whereas a single ChebConv layer combines information from up to K hops of the raw input features in one step (see the sketch after this list).

  • GCNConv tends to over-smooth node representations as the number of layers grows. ChebConv can potentially avoid that, because its output is a combination of k-hop node embeddings for k = 0, ..., K rather than a repeatedly smoothed signal.
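
A minimal sketch of that difference, assuming PyTorch Geometric; the toy graph, feature size, and hidden size below are placeholders:

```python
import torch
from torch_geometric.nn import ChebConv, GCNConv

# Toy graph: 4 nodes with 16-dim features (all sizes here are placeholders).
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])

# One ChebConv with K=3: a degree-3 Chebyshev polynomial of the scaled Laplacian
# applied directly to the raw input features, with no nonlinearity between hops.
cheb = ChebConv(16, 32, K=3)
out_cheb = cheb(x, edge_index)

# Three stacked GCNConv layers: each layer aggregates the previous layer's
# (nonlinearly transformed) embeddings rather than the raw features.
gcn1, gcn2, gcn3 = GCNConv(16, 32), GCNConv(32, 32), GCNConv(32, 32)
h = torch.relu(gcn1(x, edge_index))
h = torch.relu(gcn2(h, edge_index))
out_gcn = gcn3(h, edge_index)
```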

Two things you could try or check:

  1. Look at the homophily of your labels (a quick check is sketched below). If homophily is low, then that could be the re…
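
A quick way to run that check, assuming PyTorch Geometric's torch_geometric.utils.homophily helper; the Cora dataset below is only a stand-in for your own data:

```python
from torch_geometric.datasets import Planetoid
from torch_geometric.utils import homophily

# Cora is only a placeholder here -- substitute your own Data object.
data = Planetoid(root='/tmp/Planetoid', name='Cora')[0]

# Edge homophily: fraction of edges connecting nodes that share a label.
# Values well below ~0.5 point to a heterophilous graph, where the smoothing
# done by stacked GCNConv layers can hurt accuracy.
print(homophily(data.edge_index, data.y, method='edge'))
```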
