Hi, recently I've been trying to learn about graph embedding. I want to embed the disjunctive graphs of the Job Shop Scheduling Problem (JSSP); I have 4000+ graphs for a 6x6 JSSP and would like a model that learns embeddings for this dataset. Looking around this GitHub repository, I found the autoencoder example. However, as far as I understand, that autoencoder is trained for link prediction using only a single graph as input. In my case, I want to train the model on multiple graphs from my dataset, and perhaps use the lower-dimensional latent embedding for a simple regression that predicts the minimum makespan of my JSSP instances. Can someone give me some insight on where to start and how to do it?
It is correct that the autoencoder currently only works on a single graph; some modifications are necessary to apply it to batches of graphs. Nonetheless, I don't think the embeddings produced by the autoencoder model are of much use for downstream tasks other than link prediction. Instead, you can either train your model directly in an end-to-end fashion against your ground-truth regression targets, or you may want to look at our Deep Graph Infomax example (see `examples/dgi.py`). However, that example also currently takes only a single graph as input; it shouldn't be too complex to convert it to a mini-batch scenario, though.