Sparse topk
#3397
Replies: 2 comments 6 replies
-
It is implemented in PyTorch; I think this is what you are looking for: https://pytorch.org/docs/stable/generated/torch.topk.html
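For context, `torch.topk` returns both the largest values and their positions along a dimension. A plain-Python sketch of its 1-D behavior (an illustration only, not the real implementation; the helper name `topk_1d` is made up here):

```python
def topk_1d(values, k):
    # Mimics torch.topk on a 1-D list: returns (top-k values, their indices),
    # ordered by descending value.
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order

print(topk_1d([0.2, 0.1, 0.4, 0.3], k=2))  # → ([0.4, 0.3], [2, 3])
```

Note that `torch.topk` alone operates along a fixed dimension and knows nothing about a `batch` vector, which is why the thread below pairs it with dense batching.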
-
An alternative approach is to utilize PyTorch's dense batching:

```python
x, mask = to_dense_batch(x, batch)
torch.topk(x, k=2, dim=1)
```
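To show what this two-step approach computes, here is a plain-Python sketch (illustration only; the helper name `topk_per_batch` is made up, and the real code would use `torch_geometric.utils.to_dense_batch` followed by `torch.topk` to do this vectorized):

```python
def topk_per_batch(x, batch, k):
    # Group node values by their graph (batch) index.
    groups = {}
    for value, b in zip(x, batch):
        groups.setdefault(b, []).append(value)
    # Take the k largest values within each group.
    values, new_batch = [], []
    for b in sorted(groups):
        for v in sorted(groups[b], reverse=True)[:k]:
            values.append(v)
            new_batch.append(b)
    return values, new_batch

x = [0.2, 0.1, 0.1, 0.4, 0.4, 0.05, 0.05, 0.1, 0.05, 0.01, 0.2, 0.15, 0.05, 0.4]
batch = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]
print(topk_per_batch(x, batch, k=2))  # → ([0.4, 0.4, 0.4, 0.2], [0, 0, 1, 1])
```

On the example data from the question this reproduces the expected `data.topk` and `data.batch` from the original post.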
-
Hi,
I was wondering if there is an easy way to find the top-k values of a prediction in PyTorch Geometric. For example, imagine that you have the following data structure:
data.x = [0.2, 0.1, 0.1, 0.4, 0.4, 0.05, 0.05, 0.1, 0.05, 0.01, 0.2, 0.15, 0.05, 0.4]
data.batch = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]
What I would like to do, for instance, is find the k top values independently for each sample. I would expect something like:
k = 2
data.topk = [ 0.4, 0.4, 0.4, 0.2]
data.batch = [0, 0, 1, 1]
Is there any method in PyTorch Geometric to do that? I have been checking the documentation but was not able to find anything like this.
EDIT:
I found this method https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/glob/sort.html#global_sort_pool which should do the trick; however, it does not return the locations of the maximum values, and I need those to recover the new batch vector. Is there a method I am missing?
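The index-recovery step asked about here can be sketched in plain Python (the helper name `topk_with_indices` is hypothetical, not a torch_geometric API): if the top-k step keeps the original positions of the selected values, the new batch vector is just the old batch vector gathered at those positions.

```python
def topk_with_indices(x, batch, k):
    # Collect, per graph, the original positions of its nodes.
    by_graph = {}
    for i, b in enumerate(batch):
        by_graph.setdefault(b, []).append(i)
    # Keep the positions of the k largest values in each graph
    # (descending by value), then gather values and batch ids through them.
    perm = []
    for b in sorted(by_graph):
        top = sorted(by_graph[b], key=lambda i: x[i], reverse=True)[:k]
        perm.extend(top)
    return [x[i] for i in perm], [batch[i] for i in perm], perm

x = [0.2, 0.1, 0.1, 0.4, 0.4, 0.05, 0.05, 0.1, 0.05, 0.01, 0.2, 0.15, 0.05, 0.4]
batch = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]
print(topk_with_indices(x, batch, k=2))
# → ([0.4, 0.4, 0.4, 0.2], [0, 0, 1, 1], [3, 4, 13, 10])
```

In the tensor setting, `torch.topk` already returns this index tensor as its second output, so `batch[indices]` plays the role of the final gather.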
Thanks!