
PYGON

A Graph Neural Network Tool for Recovering Dense Sub-graphs in Random Dense Graphs.

Installation

This code requires installing and running the graph-measures package. A ready-to-use copy of this package is currently included (in the "graph_calculations" directory), but it is also possible to remove its contents, download the graph-measures repository, and follow the instructions below. The conda environment for this project (step 2 in the instructions) is required either way.
A detailed explanation of the graph-measures package appears in the manual in the graph-measures repository. Here are short instructions:

  1. Download the graph-measures project into "graph_calculations/graph_measures".
  2. Create the anaconda environment for running this project by running conda env create -f env.yml in the terminal.
  3. Activate the new environment: conda activate boost.
  4. Move into the directory "graph_calculations/graph_measures/features_algorithms/accelerated_graph_features/src".
  5. Make the feature calculation files for motif calculations: make -f Makefile-gpu.
  6. Great! Now one should be able to run PYGON end-to-end. Remember to work in the boost environment when using this code.
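
For convenience, steps 2-5 above can be condensed into the following terminal sketch; it assumes graph-measures has already been downloaded into "graph_calculations/graph_measures" (step 1) and that env.yml sits in the current working directory:

    conda env create -f env.yml    # step 2: create the anaconda environment
    conda activate boost           # step 3: activate the new environment
    cd graph_calculations/graph_measures/features_algorithms/accelerated_graph_features/src
    make -f Makefile-gpu           # steps 4-5: build the motif calculation files
    cd -                           # return to the starting directory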

Note that this code was tested only on Unix machines with GPUs; some feature calculations might not work on other machines.
Note also that the virtual environment we used is Anaconda-based.

How to Use

  • The main code directory is "model". The other directory contains the code for feature calculations and will hold saved pickle files of graphs and their features.

  • To simply try the PYGON model, run python pygon_main.py in a terminal (see the sketch after this list). This runs a simple training of PYGON on G(500, 0.5, 20) graphs, i.e., random graphs on 500 vertices with edge probability 0.5 and a planted sub-graph of 20 vertices, which are built and dumped in "graph_calculations/pkl". One can change the parameters or graph specifications appearing there to try PYGON on other graph sizes, edge probabilities, planted sub-graph sizes, planted sub-graph types, or model hyper-parameters.

  • More detailed performance tests can be found in performance_testing.py.

  • To run an NNI experiment on the performance of PYGON, move into "to_run_nni" and run the experiment configuration as guided in NNI's documentation.

  • The existing algorithms to which we compared our performance, as well as a faster version of PYGON (without dumping or printing anything), can be found in other_algorithms.py.

  • The cleaning stage (the cleaning algorithm applied as a second stage after PYGON) can be found in second_stage.py.
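
As mentioned above, a minimal end-to-end run might look like the following sketch; it assumes the installation steps have been completed and that pygon_main.py is launched from inside the "model" directory:

    conda activate boost     # work in the boost environment
    cd model                 # the main code directory
    python pygon_main.py     # train PYGON on G(500, 0.5, 20) graphs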

Owner
Yoram Louzoun's Lab