-
I'm new to pyg and would like to do edge regression on a graph, more specifically flows between nodes. That is, I have an undirected graph with one target value per edge (the sign of which indicates the direction of the flow). I'm not sure how to write this in code: is there a way to achieve this without having to specify a target value for both directions in the graph? What confuses me is the undirected nature of the underlying data graph, which seems to require that I specify target values for both directions. I haven't been able to find a simple example of edge regression; I'd be happy to be directed to some code.
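To make the setup concrete, here is a tiny made-up example of what I mean (the numbers and names are placeholders):

import torch
from torch_geometric.data import Data

# Toy graph: 3 nodes, 2 physical edges (0-1 and 1-2).
# PyG stores an undirected graph with both directions in edge_index,
# so each physical edge appears twice.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])

x = torch.randn(3, 4)  # some node features

# What I have as targets: one signed flow value per physical edge
# (the sign indicates the direction of the flow).
flows = torch.tensor([0.7, -1.3])

data = Data(x=x, edge_index=edge_index)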
-
I think you have two options:

1. Combine the embeddings of the two endpoints symmetrically (e.g., via element-wise multiplication), so that both directions of an edge yield the same prediction:

   out = self.MLP(x[edge_index[0]] * x[edge_index[1]])

2. Only compute predictions (and keep targets) for one direction of each edge:

   mask = edge_index[0] >= edge_index[1]
   out = self.MLP(torch.cat([x[edge_index[0, mask]], x[edge_index[1, mask]]], dim=-1))
   y = y[mask]
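To make that more concrete, here is a rough, self-contained sketch of both options; the MLPs, the embedding size, and the toy tensors are placeholders, not part of any particular API:

import torch
from torch.nn import Sequential, Linear, ReLU

hidden = 16

# Placeholder node embeddings and a toy undirected edge_index (both directions stored).
x = torch.randn(4, hidden)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])
y = torch.randn(edge_index.size(1))  # one target per stored direction

# Option 1: combine the two endpoint embeddings symmetrically (element-wise product),
# so both directions of an edge produce the same prediction.
mlp_sym = Sequential(Linear(hidden, hidden), ReLU(), Linear(hidden, 1))
out_sym = mlp_sym(x[edge_index[0]] * x[edge_index[1]])          # shape [6, 1]

# Option 2: keep only one direction of every edge and predict just on that subset.
mask = edge_index[0] >= edge_index[1]
mlp_dir = Sequential(Linear(2 * hidden, hidden), ReLU(), Linear(hidden, 1))
out_dir = mlp_dir(torch.cat([x[edge_index[0, mask]],
                             x[edge_index[1, mask]]], dim=-1))  # shape [3, 1]
y_dir = y[mask]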
-
I think the formatting got a bit messed up there; could you fix that?
-
Sorry, no, I'm a bit too new to this to understand your example. Perhaps showing my current code will help give you an idea of my current level of knowledge.
-
It could be something like:

import torch
import torch.nn.functional as F
from torch.nn import Linear
from torch_geometric.nn import GCNConv

class GNN(torch.nn.Module):
    name = "gnn"

    def __init__(self, hidden_channels: int = None) -> None:
        super().__init__()
        torch.manual_seed(1234)
        TOPOLOGY = get_static_graph()  # user-defined helper returning the static graph
        nr_of_nodes = TOPOLOGY.number_of_nodes()
        hidden_channels = hidden_channels or nr_of_nodes
        self.conv1 = GCNConv(-1, hidden_channels)  # -1 lets PyG infer the input feature size lazily
        self.conv2 = GCNConv(hidden_channels, hidden_channels)
        self.lin = Linear(2 * hidden_channels, 1)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index)
        x = F.relu(x)
        x = self.conv2(x, edge_index)
        x = F.relu(x)
        # Build one feature vector per edge by concatenating the embeddings
        # of its two endpoints, then regress a single value per edge.
        edge_x = torch.cat([x[edge_index[0]], x[edge_index[1]]], dim=-1)
        return self.lin(edge_x)

model = GNN(20)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # any optimizer works; lr is arbitrary here
criterion = torch.nn.MSELoss()

for snapshot in samples:
    # snapshot.x contains node features; edge_index is always the same (static graph)
    out = model(snapshot.x, snapshot.edge_index)  # shape [num_edges, 1]
    loss = criterion(out, snapshot.flows)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
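If you want to combine this with the second option from earlier in the thread (predicting each physical edge only once), one possible adjustment of the training loop could look like the following; it assumes samples is indexable and that snapshot.flows holds one value per stored edge direction:

# The graph is static, so the mask can be built once, e.g. from the first snapshot.
edge_index = samples[0].edge_index
mask = edge_index[0] >= edge_index[1]

for snapshot in samples:
    out = model(snapshot.x, snapshot.edge_index).view(-1)  # one value per stored direction
    # Only keep one direction of every physical edge in the loss.
    loss = criterion(out[mask], snapshot.flows[mask])
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()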
-
Ah, thanks! So, do I understand correctly that
?