GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models

Overview

This repository is the official PyTorch implementation of GraphRNN, an auto-regressive generative model for graphs.
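
In brief (this summary follows the paper; see it for details): GraphRNN represents a graph under a BFS node ordering as a sequence S = (S_1, ..., S_n), where S_i is the adjacency vector between node i and the nodes generated before it, and factorizes the distribution auto-regressively as

p(S) = \prod_{i=1}^{n} p(S_i | S_1, ..., S_{i-1}),

with a graph-level RNN maintaining the state of the graph generated so far and an edge-level model generating each S_i.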

Jiaxuan You*, Rex Ying*, Xiang Ren, William L. Hamilton, Jure Leskovec, GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models (ICML 2018)

Installation

Install PyTorch following the instructions on the official website. The code has been tested with PyTorch 0.2.0 and 0.4.0.

conda install pytorch torchvision cuda90 -c pytorch

Then install the other dependencies.

pip install -r requirements.txt

Test run

python main.py

Code description

For the GraphRNN model: main.py is the main executable file, and specific arguments are set in args.py. train.py includes the training iterations and calls model.py and data.py; create_graphs.py is where the target graph datasets are prepared.

For baseline models:

  • Barabási-Albert (B-A) and Erdős-Rényi (E-R) models are implemented in baselines/baseline_simple.py (see the sketch after this list).
  • The Kronecker graph model is implemented in the SNAP software, which can be found at https://github.com/snap-stanford/snap/tree/master/examples/krongen (for generating Kronecker graphs) and https://github.com/snap-stanford/snap/tree/master/examples/kronfit (for learning the model parameters).
  • MMSB is implemented using the Edward library (http://edwardlib.org/), and is located in baselines.
  • We implemented the DeepGMG model based on the instructions of their paper in main_DeepGMG.py.
  • We implemented the GraphVAE model based on the instructions of their paper in baselines/graphvae.
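
For reference, the B-A and E-R baselines correspond to classic random-graph models that networkx can also generate directly. A minimal sketch for orientation (not the repo's baseline code; the parameter values are arbitrary):

import networkx as nx

ba = nx.barabasi_albert_graph(n=100, m=2)  # B-A: preferential attachment
er = nx.erdos_renyi_graph(n=100, p=0.05)   # E-R: each edge present with probability p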

Parameter setting: To adjust the hyper-parameters and input arguments of the model, modify the fields of args.py accordingly. For example, args.cuda controls which GPU is used to train the model, and args.graph_type specifies which dataset is used to train the generative model. See the documentation in args.py for more detailed descriptions of all fields.
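
As an illustration, the settings mentioned above are plain attributes that can be edited in place. A minimal sketch, assuming args.py exposes a class named Args (the actual file defines many more fields, and the values below are examples):

from args import Args  # assumption: args.py exposes a class named Args

args = Args()
args.cuda = 0             # index of the GPU used for training
args.graph_type = 'grid'  # example dataset name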

Outputs

There are several different types of outputs, each saved into a different directory under a path prefix. The path prefix is set at args.dir_input. Suppose that this field is set to ./:

  • ./graphs contains the pickle files of the training, test, and generated graphs. Each pickle file contains a list of networkx objects (see the loading sketch after this list).
  • ./eval_results contains the evaluation of MMD scores in txt format.
  • ./model_save stores the model checkpoints.
  • ./nll saves the log-likelihood for generated graphs as sequences.
  • ./figures is used to save visualizations (see Visualization of graphs section).
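
As mentioned above, the saved graph lists can be inspected directly with pickle. A minimal sketch (the filename below is hypothetical; actual filenames are derived from the run's arguments):

import pickle

# Hypothetical filename; actual names depend on the run's arguments.
with open('./graphs/GraphRNN_RNN_grid_generated.dat', 'rb') as f:
    graphs = pickle.load(f)  # a list of networkx graph objects

print(len(graphs), graphs[0].number_of_nodes())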

Evaluation

The evaluation is done in evaluate.py, where the user can choose which settings to evaluate. To evaluate how close the generated graphs are to the ground-truth set, we use MMD (maximum mean discrepancy) to measure the divergence between statistics of the ground-truth and generated graphs. Three types of statistics are compared: the degree distribution, the clustering coefficient distribution, and orbit counts (described below). The first two are implemented in eval/stats.py, using the multiprocessing Python module. One can easily extend the evaluation to compute MMD over other graph statistics.
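
For intuition, MMD^2 between two sample sets can be estimated with a kernel two-sample statistic. A generic sketch (not the repo's exact eval/stats.py implementation; the Gaussian kernel and bandwidth are assumptions), where X and Y are lists of fixed-length statistic vectors, e.g. degree histograms:

import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # RBF kernel between two fixed-length vectors
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def mmd_squared(X, Y, sigma=1.0):
    # Biased estimator: E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    k_xx = np.mean([gaussian_kernel(a, b, sigma) for a in X for b in X])
    k_yy = np.mean([gaussian_kernel(a, b, sigma) for a in Y for b in Y])
    k_xy = np.mean([gaussian_kernel(a, b, sigma) for a in X for b in Y])
    return k_xx + k_yy - 2 * k_xy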

We also compute the orbit counts of each graph, representing each graph as a high-dimensional data point, and compute the MMD between the two sets of points. Orbit counts are computed with ORCA (see http://www.biolab.si/supp/orca/orca.html), located at eval/orca. One first needs to compile ORCA by running

g++ -O2 -std=c++11 -o orca orca.cpp

in the eval/orca directory. (The binary file already included in the repo works on Ubuntu.)

To evaluate, run

python evaluate.py

Arguments specific to evaluation are specified in the class evaluate.Args_evaluate. Note that the field Args_evaluate.dataset_name_all must only contain datasets for which a model has already been trained, by setting args.graph_type to each of those datasets and running python main.py.
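
For example, one might restrict evaluation to two trained datasets like so (a sketch; the actual class defines more fields, and the dataset names below are placeholders):

from evaluate import Args_evaluate

eval_args = Args_evaluate()
# Only list datasets for which 'python main.py' has already been run.
eval_args.dataset_name_all = ['grid', 'community']  # placeholder names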

Visualization of graphs

The training, test, and generated graphs are saved at ./graphs. One can visualize a generated graph using the function utils.load_graph_list, which loads a list of graphs from a pickle file, and utils.draw_graph_list, which plots graphs using networkx.
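
A minimal usage sketch (the filename is hypothetical, and a single graph is drawn directly with networkx for simplicity):

import matplotlib.pyplot as plt
import networkx as nx
from utils import load_graph_list  # helper mentioned above

# Hypothetical filename; actual names depend on the run's arguments.
graphs = load_graph_list('./graphs/GraphRNN_RNN_grid_generated.dat')
nx.draw(graphs[0], node_size=30)
plt.savefig('./figures/example.png')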

Misc

Jesse Bettencourt and Harris Chan have made great slides introducing GraphRNN in Prof. David Duvenaud's seminar course, Learning Discrete Latent Structure.
