Code for paper "Multi-level Disentanglement Graph Neural Network"

Overview

Multi-level Disentanglement Graph Neural Network (MD-GNN)

This is a PyTorch implementation of MD-GNN. The code includes the following modules:

  • Datasets (Cora, Citeseer, Pubmed, Synthetic, and ZINC)

  • Training paradigm for node classification, graph classification, and graph regression tasks

  • Visualization

  • Evaluation metrics

Main Requirements

  • dgl==0.4.3.post2
  • networkx==2.4
  • numpy==1.18.1
  • ogb==1.1.1
  • scikit-learn==0.22.2.post1
  • scipy==1.4.1
  • torch==1.5.0
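
Assuming a pip-based Python environment compatible with torch 1.5 (one possible setup, not the only one), the pinned versions above can be installed with:

pip install dgl==0.4.3.post2 networkx==2.4 numpy==1.18.1 ogb==1.1.1 scikit-learn==0.22.2.post1 scipy==1.4.1 torch==1.5.0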

Description

  • train.py

    • main() -- Train a new model for the node classification task on the Cora, Citeseer, and Pubmed datasets
    • evaluate() -- Test the learned model for the node classification task on the Cora, Citeseer, and Pubmed datasets
    • main_synthetic() -- Train a new model for the graph classification task on the Synthetic dataset
    • evaluate_synthetic() -- Test the learned model for the graph classification task on the Synthetic dataset
    • main_zinc() -- Train a new model for the graph regression task on the ZINC dataset
    • evaluate_zinc() -- Test the learned model for the graph regression task on the ZINC dataset
  • dataset.py

    • load_data() -- Load the data of the selected dataset
  • MDGNN.py

    • MDGNN() -- Define the MD-GNN model and its loss function
  • utils.py

    • evaluate_att() -- Evaluate attribute-level disentanglement with the visualization of relation-related attributes
    • evaluate_corr() -- Evaluate node-level disentanglement with the correlation analysis of latent features (a sketch of this kind of analysis is given after this list)
    • evaluate_graph() -- Evaluate graph-level disentanglement with the visualization of disentangled relation graphs
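
As a rough illustration of what the correlation analysis for node-level disentanglement can look like (a hedged sketch with placeholder names, not the actual evaluate_corr() implementation), one can split the latent node features into K channels and check that different channels are close to uncorrelated:

import numpy as np

def channel_correlation(z, num_channels):
    # z: (num_nodes, num_channels * channel_dim) latent node features (placeholder input).
    num_nodes, dim = z.shape
    channel_dim = dim // num_channels
    # Summarize each channel by the per-node L2 norm of its feature slice.
    strength = np.stack(
        [np.linalg.norm(z[:, k * channel_dim:(k + 1) * channel_dim], axis=1)
         for k in range(num_channels)], axis=1)
    corr = np.corrcoef(strength, rowvar=False)            # (K, K) Pearson correlation matrix
    off_diag = corr[~np.eye(num_channels, dtype=bool)]    # drop the diagonal
    return np.abs(off_diag).mean()

# Toy usage with random features standing in for MD-GNN embeddings on Cora (2708 nodes).
z = np.random.randn(2708, 64)
print(channel_correlation(z, num_channels=4))

A lower mean absolute off-diagonal correlation suggests that the learned channels capture more independent factors.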

Running the code

  1. Install the required dependency packages and unzip the files in the data folder.

  2. We use DGL to implement all the GNN models on the three citation datasets (Cora, Citeseer, and Pubmed). To evaluate the model with different splitting strategies (fewer and harder label rates), you need to replace the following file with the citation_graph.py provided:

dgl/data/citation_graph.py
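
If you are not sure where the installed DGL package lives, the following snippet (a small helper sketch, not part of the repository) prints the full path of the file that needs to be replaced:

import os
import dgl

# The installed copy of citation_graph.py sits inside the dgl package directory.
print(os.path.join(os.path.dirname(dgl.__file__), "data", "citation_graph.py"))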

  3. To get the results on a specific dataset, run the following command with proper hyperparameters:
python train.py --dataset data_name

where data_name is one of the five datasets (cora, citeseer, pubmed, synthetic, or zinc). The model and the training log will be saved to the corresponding directory under ./log for evaluation.
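
For example, to train on each of the supported datasets:

python train.py --dataset cora
python train.py --dataset citeseer
python train.py --dataset pubmed
python train.py --dataset synthetic
python train.py --dataset zinc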

  4. To evaluate the performance of the three-level disentanglement, run:
python utils.py

License

MD-GNN is released under the MIT license.

Owner

Lirong Wu, Ph.D. student working on graphs.