Official implementation of Rethinking Graph Neural Architecture Search from Message-passing (CVPR 2021)


Rethinking Graph Neural Architecture Search from Message-passing

Intro

GNAS can automatically learn a better architecture with the optimal depth of message passing on the graph. Specifically, we design the Graph Neural Architecture Paradigm (GAP), with a tree-topology computation procedure and two types of fine-grained atomic operations (feature filtering and neighbor aggregation) derived from the message-passing mechanism, to construct a powerful graph network search space. Feature filtering performs adaptive feature selection, while neighbor aggregation captures structural information and computes neighbors' statistics. Experiments show that GNAS can search for better GNNs with multiple message-passing mechanisms and an optimal message-passing depth.
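To make the two operation types concrete, here is a minimal sketch in plain PyTorch over a dense adjacency matrix; the class names and exact formulations (a sigmoid gate for filtering, an element-wise max for aggregation) are illustrative assumptions, not the repository's implementation.

import torch
import torch.nn as nn

class FeatureFiltering(nn.Module):
    """Adaptive feature selection: a learned soft gate over node features."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, h):                           # h: [num_nodes, dim]
        return h * self.gate(h)                     # suppress uninformative features

class MaxNeighborAggregation(nn.Module):
    """Neighbors' statistics: element-wise max over each node's neighbors."""
    def forward(self, h, adj):                      # adj: [num_nodes, num_nodes], 0/1
        n, d = h.shape
        neigh = h.unsqueeze(0).expand(n, n, d).clone()
        neigh[adj == 0] = float('-inf')             # mask out non-neighbors
        out = neigh.max(dim=1).values               # out[i] = max over j with adj[i, j] = 1
        return out.masked_fill(out == float('-inf'), 0.0)  # isolated nodes -> 0

In GAP, operations of these two types are wired into a tree-topology cell; the search decides which operation occupies each position, which in turn fixes the message-passing depth.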

Getting Started

0. Prerequisites

  • Linux
  • NVIDIA GPU + CUDA CuDNN

1. Setup Python environment for GPU

# Clone the GitHub repo
conda install git
git clone https://github.com/phython96/GNAS-MP.git
cd GNAS-MP

# Install the Python environment
conda env create -f environment_gpu.yml
conda activate gnas
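
Optionally, a quick sanity check can confirm that the GPU is visible from the new environment (this assumes environment_gpu.yml installs a CUDA build of PyTorch; adjust if your setup differs):

# Check that PyTorch sees the GPU (assumes a CUDA build of PyTorch)
python -c "import torch; print(torch.cuda.is_available())"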

2. Download datasets

The datasets are provided by the benchmarking-gnns project; follow its instructions to download all the required datasets.

3. Searching

We provide scripts for searching graph neural networks on five datasets.

# searching on ZINC dataset at graph regression task
sh scripts/search_molecules_zinc.sh [gpu_id]

# searching on SBMs_PATTERN dataset at node classification task
sh scripts/search_sbms_pattern.sh [gpu_id]

# searching on SBMs_CLUSTER dataset at node classification task
sh scripts/search_sbms_cluster.sh [gpu_id]

# searching on MNIST dataset at graph classification task
sh scripts/search_superpixels_mnist.sh [gpu_id]

# searching on CIFAR10 dataset at graph classification task
sh scripts/search_superpixels_cifar10.sh [gpu_id]

When the search procedure finishes, copy the searched genotypes from "./save/[data_name]_search.txt" into "./configs/genotypes.py".

For example, after searching on the MNIST dataset, we obtain the genotype results file "./save/MNIST_search.txt":

Epoch : 19
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_dense', 9, 7)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 0), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_identity', 9, 4)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 1), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_identity', 8, 7), ('f_sparse', 9, 4)], concat_node=None)]
Epoch : 20
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_dense', 8, 4), ('f_sparse', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_sparse', 9, 8)], concat_node=None)]
Epoch : 21
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 0), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_identity', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 4), ('f_identity', 9, 7)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_identity', 9, 4)], concat_node=None)]

Copy the fourth line of the file above and paste it into "./configs/genotypes.py" with the prefix "MNIST_Net = ":

MNIST_Net = [Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_dense', 8, 4), ('f_sparse', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_sparse', 9, 8)], concat_node=None)]
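
For reference, the entries above are consistent with a namedtuple of the following shape. This is a sketch inferred from the printed output, not copied from the repository's source, and reading each alpha_cell triple as (operation, node_id, input_node_id) is likewise an inference from the numbering pattern.

from collections import namedtuple

# Inferred from the output format above; the authoritative definition
# lives in the repository's source code.
Genotype = namedtuple('Genotype', ['alpha_cell', 'concat_node'])

# Each alpha_cell triple appears to be (operation, node_id, input_node_id),
# mixing feature-filtering ops ('f_sparse', 'f_dense', 'f_identity') with
# neighbor-aggregation ops ('a_max') in a tree-topology cell.
cell = Genotype(alpha_cell=[('f_sparse', 1, 0), ('a_max', 4, 1)], concat_node=None)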

4. Training

Before training, confirm that there is a genotype of the searched graph neural network in "./configs/genotypes.py".

We provide scripts for training the graph neural networks searched by GNAS.

# training on ZINC dataset at graph regression task
sh scripts/train_molecules_zinc.sh [gpu_id]

# training on SBMs_PATTERN dataset at node classification task
sh scripts/train_sbms_pattern.sh [gpu_id]

# training on SBMs_CLUSTER dataset at node classification task
sh scripts/train_sbms_cluster.sh [gpu_id]

# training on MNIST dataset at graph classification task
sh scripts/train_superpixels_mnist.sh [gpu_id]

# training on CIFAR10 dataset at graph classification task
sh scripts/train_superpixels_cifar10.sh [gpu_id]

Results

Visualization

Here, we show the 4-layer graph neural networks searched by GNAS on five datasets across three graph tasks.

Reference

to be updated
