deep_gcns_torch
1.0.0
In this work, we present new ways to successfully train very deep GCNs. We borrow concepts from CNNs, mainly residual/dense connections and dilated convolutions, and adapt them to GCN architectures. Through extensive experiments, we show the positive effect of these deep GCN frameworks.
[Project] [Paper] [Slides] [TensorFlow Code] [PyTorch Code]

We conduct extensive experiments to show how different components (#layers, #filters, #nearest neighbors, dilation, etc.) affect DeepGCNs. We also provide ablation studies on different types of deep GCNs (MRGCN, EdgeConv, GraphSAGE, and GIN).

Please see the Readme.md of each task inside the examples folder (e.g. examples/ogb_eff/ogbn_arxiv_dgl) for details. All information about code, data, and pretrained models can be found there.

Install the environment by running:

source deepgcn_env_install.sh
.
├── misc # Misc images
├── utils # Common useful modules
├── gcn_lib # gcn library
│ ├── dense # gcn library for dense data (B x C x N x 1)
│ └── sparse # gcn library for sparse data (N x C)
├── eff_gcn_modules # modules for mem efficient gnns
├── examples
│ ├── modelnet_cls # code for point clouds classification on ModelNet40
│ ├── sem_seg_dense # code for point clouds semantic segmentation on S3DIS (data type: dense)
│ ├── sem_seg_sparse # code for point clouds semantic segmentation on S3DIS (data type: sparse)
│ ├── part_sem_seg # code for part segmentation on PartNet
│ ├── ppi # code for node classification on PPI dataset
│ └── ogb # code for node/graph property prediction on OGB datasets
│ └── ogb_eff # code for node/graph property prediction on OGB datasets with memory efficient GNNs
└── ...
Please cite our papers if you find anything helpful:
@InProceedings{li2019deepgcns,
title={DeepGCNs: Can GCNs Go as Deep as CNNs?},
author={Guohao Li and Matthias Müller and Ali Thabet and Bernard Ghanem},
booktitle={The IEEE International Conference on Computer Vision (ICCV)},
year={2019}
}
@article{li2021deepgcns_pami,
title={DeepGCNs: Making GCNs Go as Deep as CNNs},
author={Li, Guohao and M{\"u}ller, Matthias and Qian, Guocheng and Perez, Itzel Carolina Delgadillo and Abualshour, Abdulellah and Thabet, Ali Kassem and Ghanem, Bernard},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2021},
publisher={IEEE}
}
@misc{li2020deepergcn,
title={DeeperGCN: All You Need to Train Deeper GCNs},
author={Guohao Li and Chenxin Xiong and Ali Thabet and Bernard Ghanem},
year={2020},
eprint={2006.07739},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
@InProceedings{li2021gnn1000,
title={Training Graph Neural Networks with 1000 layers},
author={Guohao Li and Matthias Müller and Bernard Ghanem and Vladlen Koltun},
booktitle={International Conference on Machine Learning (ICML)},
year={2021}
}
MIT License
For further information, please contact Guohao Li, Guocheng Qian, or Matthias Müller.