[Under construction].
[Paper]. This work has been accepted to ICML'23!
This repository includes the implementation of FavardGNN and OptBasisGNN, two spectral graph neural networks that adapt their polynomial bases for filtering.
If you have any questions about our methodology or this repository, please contact me or raise an issue.
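For intuition only: by Favard's theorem, any sequence of orthonormal polynomials satisfies a three-term recurrence with suitable positive coefficients, so the recurrence coefficients themselves can be made learnable. The sketch below illustrates that idea; it is *not* the repository's implementation, and all names (`favard_style_filter`, `alphas`, `gammas`, `sqrt_betas`) are ours.

```python
import torch

def favard_style_filter(L_hat, x, alphas, gammas, sqrt_betas):
    # Returns sum_k alphas[k] * p_k(L_hat) @ x, where the polynomials p_k
    # follow a learnable three-term recurrence:
    #   sqrt_betas[k+1] * p_{k+1}(t) = (t - gammas[k]) * p_k(t) - sqrt_betas[k] * p_{k-1}(t)
    K = len(alphas)                      # number of basis polynomials
    p_prev = torch.zeros_like(x)         # p_{-1} = 0
    p_cur = x / sqrt_betas[0]            # p_0 = 1 / sqrt_betas[0]
    out = alphas[0] * p_cur
    for k in range(K - 1):
        p_next = (L_hat @ p_cur - gammas[k] * p_cur - sqrt_betas[k] * p_prev) / sqrt_betas[k + 1]
        p_prev, p_cur = p_cur, p_next
        out = out + alphas[k + 1] * p_cur
    return out

# Toy usage on a random symmetric "propagation" matrix
n, K = 6, 4
M = torch.randn(n, n); L_hat = (M + M.T) / 2
x = torch.randn(n, 1)
alphas = torch.randn(K)
gammas = torch.randn(K - 1)
sqrt_betas = torch.rand(K) + 0.5         # kept positive, as Favard's theorem requires
y = favard_style_filter(L_hat, x, alphas, gammas, sqrt_betas)
```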
- Requirements
- Reproducing Classification Results
- Reproducing Regression Results
- Hyperparams Tuning Scripts
Requirements [Under Construction]
Before running the experiments, the folder structure is as below:
.
├── cache
│   └── ckpts
├── data
│   └── linkx          # Code From LINKX repo
├── datasets
│   ├── geom_data
│   ├── linkx
│   └── Planetoid
├── layers
├── models
├── runs
│   └── placeholder.txt
└── utils
Reproducing Classification Results
Run the scripts in the following files under the ./ path.
sh> sh scripts/reproduce_favardgnn.sh
sh> sh scripts/reproduce_optbasis.sh
For the LINKX datasets, run the script in the following file, also under the ./ path.
sh> sh scripts/reproduce_linkx.sh
Reproducing Regression Results
Shift the working path to Regression/.
sh> cd Regression
Step 1: Prepare images
sh> unzip -d BernNetImages BernNet-LearningFilters-image.zip
Step 2: Pre-compute
Pre-compute the matrix polynomials of the 100x100 grid graph, together with the four target filters: high-pass, low-pass, band-pass, and band-reject.
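For intuition, the sketch below shows in plain NumPy (not the repo's code) what is being precomputed: the normalized Laplacian of an Np x Np grid graph and the four ideal filters evaluated on its spectrum. The filter definitions follow the BernNet filter-learning experiments; the variable names and the smaller default size are our own choices.

```python
import numpy as np

Np = 20   # the paper's setting is Np = 100; a 10000x10000 eigendecomposition is slow
# Path-graph adjacency; the grid graph is the Kronecker sum of two paths
P = np.diag(np.ones(Np - 1), 1) + np.diag(np.ones(Np - 1), -1)
A = np.kron(P, np.eye(Np)) + np.kron(np.eye(Np), P)
d = A.sum(axis=1)
L = np.eye(Np * Np) - A / np.sqrt(np.outer(d, d))   # normalized Laplacian, spectrum in [0, 2]
lam, U = np.linalg.eigh(L)

# Ideal filters evaluated on the eigenvalues
filters = {
    "lowpass":    np.exp(-10 * lam**2),
    "highpass":   1 - np.exp(-10 * lam**2),
    "bandpass":   np.exp(-10 * (lam - 1)**2),
    "bandreject": 1 - np.exp(-10 * (lam - 1)**2),
}
# Filtering an image x (flattened to length Np*Np) is then U @ (g * (U.T @ x))
```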
sh> python preprocess_matrix_polynomials.py
The results of this step are saved in the save/ folder.
This step can take several hours; alternatively, you can download our pre-computed matrices from this Google Drive URL and unzip them directly.
sh> mkdir save
sh> # Download cachedMatrices.zip and put it under ./Regression/save/
sh> unzip save/cachedMatrices.zip -d ./save/
sh> rm ./save/cachedMatrices.zip
The resulting files are:
./save/
├── bandpass_Np=100.pkl
├── bandreject_Np=100.pkl
├── highpass_Np=100.pkl
└── lowpass_Np=100.pkl
Step 3: Make dataset.
sh> python make_dataset.py
The result of this step is a pickle file, MultiChannelFilterDataset.pkl.
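To sanity-check the output, you can load the pickle directly. A minimal snippet; the structure of the pickled object depends on make_dataset.py, so we only inspect its type:

```python
import pickle

# The exact structure of the pickled object is defined by make_dataset.py;
# this just confirms the file loads.
with open("MultiChannelFilterDataset.pkl", "rb") as f:
    dataset = pickle.load(f)
print(type(dataset))
```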
Now we can run the regression task! At this point, the folder structure (ignoring Python files) is:
./Regression/
├── BernNet-LearningFilters-image.zip
├── MultiChannelFilterDataset.pkl
└── save
    ├── bandpass_Np=100.pkl
    ├── bandreject_Np=100.pkl
    ├── highpass_Np=100.pkl
    └── lowpass_Np=100.pkl
To reproduce Table 5, you can use the script below to run over all the samples.
sh> python main_all.py
To reproduce the convergence curves in Figure 2, you can use the following script to run one or several samples and record the losses.
sh> python main_sample.py
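If you then want to plot the recorded losses yourself, here is a throwaway sketch. The file name losses.npy is hypothetical (main_sample.py records the losses, but you must dump them yourself in whatever format you prefer):

```python
import numpy as np
import matplotlib.pyplot as plt

# `losses.npy` is a hypothetical per-epoch loss dump, not produced by the repo
losses = np.load("losses.npy")
plt.semilogy(losses)                 # log-scale y-axis suits convergence curves
plt.xlabel("epoch")
plt.ylabel("training loss")
plt.savefig("convergence_curve.png")
```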
Hyperparams Tuning Scripts [Under Construction]
If you want to test FavardGNN or OptBasisGNN on other datasets, you might need the Optuna script for hyperparameter tuning. Contact me at guoyuhe[at]ruc[dot]edu[dot]cn.
[Under Construction]
Please cite us if our work or this repo inspires you.
@inproceedings{DBLP:conf/icml/GuoW23,
  author    = {Yuhe Guo and
               Zhewei Wei},
  editor    = {Andreas Krause and
               Emma Brunskill and
               Kyunghyun Cho and
               Barbara Engelhardt and
               Sivan Sabato and
               Jonathan Scarlett},
  title     = {Graph Neural Networks with Learnable and Optimal Polynomial Bases},
  booktitle = {International Conference on Machine Learning, {ICML} 2023, 23-29 July
               2023, Honolulu, Hawaii, {USA}},
  series    = {Proceedings of Machine Learning Research},
  volume    = {202},
  pages     = {12077--12097},
  publisher = {{PMLR}},
  year      = {2023},
  url       = {https://proceedings.mlr.press/v202/guo23i.html},
  timestamp = {Wed, 16 Aug 2023 17:14:15 +0200},
  biburl    = {https://dblp.org/rec/conf/icml/GuoW23.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}



