IsoGCN code for ICLR2021

Overview

IsoGCN

The official implementation of IsoGCN, presented in the ICLR2021 paper Isometric Transformation Invariant and Equivariant Graph Convolutional Networks [arXiv].


Please cite us as:

@inproceedings{
horie2021isometric,
title={Isometric Transformation Invariant and Equivariant Graph Convolutional Networks},
author={Masanobu Horie and Naoki Morita and Toshiaki Hishinuma and Yu Ihara and Naoto Mitsume},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=FX0vR39SJ5q}
}

General notice

If some of the following steps do not work, please modify the User settings section in the Makefile to fit your environment.
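For reference, a minimal sketch of what such a User settings section might contain is shown below; the variable names PYTHON and GPU_ID appear elsewhere in this README, but the exact contents of your Makefile may differ.

PYTHON ?= python3          # interpreter used by make targets
GPU_ID ?= 0                # set to -1 to run on CPU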

Installation

You can either install locally or use the Docker image. However, to generate an anisotropic nonlinear heat equation dataset, we recommend using Docker.

Local install

We use poetry, so first install it following the instructions at https://python-poetry.org/docs/ . Then, either update the Makefile to PYTHON ?= 'poetry run python3' or explicitly specify the PYTHON environment variable when executing the make command.

For GPU environment,

PYTHON='poetry run python3' make poetry
poetry install
PYTHON='poetry run python3' make install_pyg_gpu

For CPU environment,

PYTHON='poetry run python3' make poetry
poetry install
PYTHON='poetry run python3' make install_pyg_cpu

, and set GPU_ID = -1 in the Makefile.

Also, optionally, please install FrontISTR and gmsh to generate an anisotropic nonlinear heat equation dataset (both are already installed in the Docker image).
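If you install them locally, a quick sanity check such as the following should confirm both tools are on your PATH (the command names fistr1 and gmsh are assumptions based on the standard distributions of FrontISTR and gmsh):

command -v fistr1
gmsh --version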

Docker image

Please download the Docker image via https://savanna.ritc.jp/~horiem/isogcn_iclr2021/images/isogcn.tar, then place the image in the images directory. After that, please run make in to log in to the Docker container before performing any of the following processes.
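For example, assuming wget is available, downloading the image and logging in could look like this (make in is the login target mentioned above):

wget https://savanna.ritc.jp/~horiem/isogcn_iclr2021/images/isogcn.tar -P images/
make in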

Differential operator dataset

Data generation

make differential_data

Training IsoGCN

make scalar2grad  # Scalar to gradient task
make scalar2grad  ADJ=5  # Scalar to gradient task with # hops = 5
make scalar2hessian  # Scalar to Hessian task
make grad2laplacian  # Gradient to Laplacian task
make grad2hessian  # Gradient to Hessian task
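The ADJ variable shown above for scalar2grad is assumed to work the same way for the other tasks, e.g.:

make grad2hessian ADJ=2  # Gradient to Hessian task with # hops = 2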

Training baseline models

make scalar2grad_baseline BASELINE_NAME=gcn  # BASELINE_NAME=[cluster_gcn, gcn, gcnii, gin, sgcn]

Similarly, one can train baseline models for the other tasks.
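For example, the following invocation is assumed to follow the same naming pattern (the grad2laplacian_baseline target is an assumption based on the scalar2grad_baseline target above):

make grad2laplacian_baseline BASELINE_NAME=gcnii  # Gradient to Laplacian task with the GCNII baseline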

Anisotropic nonlinear heat equation dataset

Run the whole process with small data to check the pipeline (optional)

This generates a small dataset to walk through the whole process of data generation, preprocessing, training, and inference. It requires either FrontISTR installed locally or the Docker image.

make small_heat_nl_tensor_pipeline

Dataset download

The dataset containing finite element analysis results is generated from the ABC dataset using gmsh for meshing and FrontISTR for analysis.

Please download the dataset you need. (Note: To perform only training, you need only 'preprocessed' data.) The dataset can be downloaded via:

After the download finishes, please merge the split archives with:

cat train_50.tar.gz.parta* > train.tar.gz

, extract them with tar xvf *.tar.gz, then place them in the corresponding data/heat_nl_tensor/(raw|interim|preprocessed) directory.
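Putting the steps together, a hypothetical sequence for the preprocessed training split could look like the following (the extracted directory name train is an assumption; adjust it to whatever the archive actually contains):

cat train_50.tar.gz.parta* > train.tar.gz
tar xvf train.tar.gz
mv train data/heat_nl_tensor/preprocessed/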

Training IsoGCN

make heat_nl_tensor

Training baseline models

make heat_nl_tensor_baseline BASELINE_NAME=gcn  # BASELINE_NAME=[cluster_gcn, gcn, gcnii, gin, sgcn]

IsoGCN core implementation

The core implementation of the IsoGCN layer is provided separately in the SiML library and can be found here. Also, the code to generate IsoAMs is provided separately in the Femio library and can be found here.

License

Apache License 2.0.
