Building on the paper by Gilmer et al. [1] and the nmp_qc codebase by Pau Riba and Anjan Dutta, we explored additional methods to improve performance. The methodology comprises two parts: Feature Engineering and Network Architecture Design.
- Feature Engineering
- Network Architecture Design
The main implementation of these methods can be found in models/MPNN_attn.py and datasets/utils.py.
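As background for the attention variants, the following is a minimal, hypothetical sketch of single-head GAT-style attention aggregation in the spirit of [2]. It is written in plain NumPy for illustration and is not the repo's actual MPNN_attn.py implementation; the function name, signature, and the LeakyReLU slope of 0.2 are assumptions.

```python
import numpy as np

def attention_aggregate(h, adj, W, a):
    """Single-head GAT-style aggregation (illustrative sketch).

    h   : (N, F)  node features
    adj : (N, N)  0/1 adjacency matrix (self-loops assumed present)
    W   : (F, F') shared linear projection
    a   : (2F',)  attention parameter vector
    """
    z = h @ W                       # project node features, (N, F')
    N = z.shape[0]
    # Attention logits e[i, j] = LeakyReLU(a . [z_i || z_j]) on edges only
    e = np.full((N, N), -np.inf)
    for i in range(N):
        for j in range(N):
            if adj[i, j]:
                s = a @ np.concatenate([z[i], z[j]])
                e[i, j] = s if s > 0 else 0.2 * s  # LeakyReLU, slope 0.2
    # Softmax over each node's neighbourhood
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ z                # attention-weighted sum of neighbour features
```

GATv2 [3] differs mainly in applying the attention vector after the nonlinearity, which makes the attention function more expressive.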
```shell
$ pip install -r requirements.txt
$ python main.py
$ python main.py --epochs 100 --mpnn --e_rep 6
$ python main.py --epochs 100 --mpnnattn --method_attn 3 --num_heads 8 --e_rep 6
```

(`method_attn`: values 1-5 are currently supported; `e_rep`: values 1-6 are currently supported)
More information on the available arguments can be found in main.py.
Running any experiment on the QM9 dataset requires installing the rdkit package, which can be done by following the instructions available here.
The data used in this project can be downloaded here.
- [1] Gilmer et al., Neural Message Passing for Quantum Chemistry, arXiv, 2017.
- [2] Veličković et al., Graph Attention Networks, arXiv, 2017.
- [3] Brody et al., How Attentive are Graph Attention Networks?, arXiv, 2022.
- Buyu Zhang (@wei0py)
- Zheyu Lu (@Nsigma-Bill)