Yes, this looks correct. The model will process 64 graphs at once (as defined by batch_size=64 in the DataLoader). PyG collates them into a single batched graph, so each forward pass sees the whole mini-batch, not one graph at a time.
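For intuition: the DataLoader collates a list of graphs into one disconnected super-graph. Node features are concatenated, each graph's edge indices are shifted by the number of nodes that precede it, and a batch vector records which graph every node came from. A minimal pure-Python sketch of that collation (plain lists instead of tensors; all names here are illustrative, not the PyG internals):

```python
def collate_graphs(graphs):
    """Merge graphs into one big graph, as PyG's Batch does conceptually.

    Each graph is a dict with 'x' (list of node features) and
    'edge_index' (list of [src, dst] pairs using graph-local node ids).
    Returns merged features, offset edges, and a batch vector mapping
    every node back to its source graph.
    """
    x, edge_index, batch = [], [], []
    offset = 0
    for graph_id, g in enumerate(graphs):
        n = len(g["x"])
        x.extend(g["x"])
        # Shift local node ids by the number of nodes already merged
        edge_index.extend([src + offset, dst + offset]
                          for src, dst in g["edge_index"])
        batch.extend([graph_id] * n)
        offset += n
    return x, edge_index, batch

g1 = {"x": [[1.0], [2.0]], "edge_index": [[0, 1]]}
g2 = {"x": [[3.0], [4.0], [5.0]], "edge_index": [[0, 2], [1, 2]]}
x, ei, batch = collate_graphs([g1, g2])
# g2's nodes are shifted by 2, so its edges become [2, 4] and [3, 4]
```

Because the merged graph is disconnected, message passing never leaks information between graphs, which is why one forward pass over the batch is equivalent to 64 independent forward passes.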
I am working on a GNN for node regression. I create my graphs as follows:

    from torch_geometric.data import Data

    nb_of_data_objects = 1000
    data_objects = []
    for i in range(nb_of_data_objects):
        # Define edge indices
        edge_indices = …..
        # Define target values
        target_values = …..
        # Define node features
        node_features = …..
        # Create the Data object
        data = Data(x=node_features, edge_index=edge_indices, y=target_values)
        data_objects.append(data)
    print("Data objects", data_objects)
Then I normalise the data and create the data loaders:

    from torch_geometric.loader import DataLoader

    # Create DataLoaders for the training/testing data
    train_loader = DataLoader(train_data, batch_size=64, shuffle=True)
    test_loader = DataLoader(test_data, batch_size=64, shuffle=False)
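As a sanity check on the batching: with batch_size=64 the loader yields one mini-batch per 64 graphs, with a smaller final batch when the dataset size is not a multiple of 64. A quick sketch (the 1000-graph figure comes from the snippet above; how many of those end up in train_data depends on the unshown split):

```python
import math

def num_batches(num_graphs, batch_size=64):
    # The last batch is smaller when num_graphs % batch_size != 0
    return math.ceil(num_graphs / batch_size)

num_batches(1000)  # upper bound, if all 1000 graphs were in train_data
```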
Is this the correct way to enter the training loop? Does the model take one graph at a time?

    # Create the GNN model instance
    model = GCNModel(1, hidden_size, 1, num_layers)

    # Define the loss function
    criterion = nn.L1Loss(reduction='none')

    # Define the optimizer
    optimizer = optim.Adam(model.parameters(), lr=lr)

    # Training loop
    for epoch in range(num_epochs):
        print("epoch", epoch)
        model.train()
        running_loss = 0.0
        total_nodes = 0
        for batch_idx, data_batch in enumerate(train_loader):
            optimizer.zero_grad()
            …
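One point worth noting about the loop above: criterion = nn.L1Loss(reduction='none') returns one loss value per node, not a scalar, so you must reduce it (e.g. .mean()) before calling backward(), and a correct running average should weight each batch by its node count, which is presumably what running_loss and total_nodes are for. A torch-free sketch of that bookkeeping (names are illustrative):

```python
def mean_absolute_error(batches):
    """Accumulate per-node |pred - target| across batches and return the
    node-weighted mean, mirroring running_loss / total_nodes."""
    running_loss = 0.0
    total_nodes = 0
    for preds, targets in batches:
        # Per-node losses, as produced by reduction='none'
        per_node = [abs(p - t) for p, t in zip(preds, targets)]
        running_loss += sum(per_node)
        total_nodes += len(per_node)
    return running_loss / total_nodes

# Two batches of different sizes: averaging per-batch means would be wrong,
# weighting by node count is right
mae = mean_absolute_error([([1.0, 2.0], [1.5, 2.0]), ([0.0], [2.0])])
```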