Discrete Denoising Flows

Code for the paper by Alexandra Lindt and Emiel Hoogeboom.

Overview

This repository contains the code for the experiments presented in the paper Discrete Denoising Flows [1].

To give a short overview of the architecture of the implementation (a rough sketch of how these pieces are wired together follows the list):

  • main.py: Starting point and configuration of experiments
  • training.py: Training logic
  • visualization_.py: Functions for plotting samples from the trained model
  • model/categorical_prior.py: Prior distribution and splitpriors
  • model/model.py: Overall model object (Discrete Denoising Flow and prior)
  • model/flow.py: Discrete Denoising Flow object
  • model/flow_layers.py: Implementations of
    • Discrete denoising coupling layer (including the conditional permutation operation introduced in the paper)
    • Permutation layer
    • Squeeze layer
  • model/network.py: Implementation of DenseNet and simple MLP
  • data/*: Logic for loading Eight Gaussians, MNIST and Cityscapes datasets
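
Below is a rough, hypothetical sketch of how these components are wired together in main.py: a Discrete Denoising Flow is combined with a categorical prior into one model, the networks and the prior presumably get separate epoch budgets (compare the --net_epochs and --prior_epochs flags used below), and samples are plotted at the end. All class and function names in this sketch are placeholders and may not match the actual identifiers in the repository.

# Hypothetical sketch; these names are placeholders, not the repository's actual identifiers.
flow = DiscreteDenoisingFlow(k_sort=2, n_hidden_nn=512)       # model/flow.py + model/flow_layers.py
prior = CategoricalPrior(with_splitprior=True)                # model/categorical_prior.py
model = DiscreteFlowModel(flow, prior)                        # model/model.py
train(model, train_loader, net_epochs=100, prior_epochs=30)   # training.py
plot_samples(model.sample(64))                                # visualization_.py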

Usage

For each of the following commands, the results are saved in the folder ./results.

8 Gaussians

To test Discrete Denoising Flows with limited computational resources, run the 8 Gaussians toy data experiment. It takes only a few minutes to execute on a laptop with 12 GB of RAM.

python main.py --dataset='8gaussians' --k_sort=91 --n_hidden_nn=256 --net_epochs=30 --prior_epochs=20
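
For reference, the snippet below is a minimal, self-contained sketch of what such a toy dataset could look like: points are sampled from eight Gaussians arranged on a circle and each coordinate is quantized into a fixed number of categories. The choice of num_bins=91 mirrors the --k_sort=91 flag above, but this correspondence and the exact binning are assumptions; the repository's actual data generation lives in data/*.

# Hypothetical sketch of an eight Gaussians toy dataset; the repository's exact
# sampling and quantization (see data/*) may differ.
import numpy as np

def eight_gaussians(n, num_bins=91, scale=2.0, std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    angles = rng.integers(8, size=n) * (2 * np.pi / 8)        # pick one of 8 modes on a circle
    centers = scale * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    x = centers + std * rng.normal(size=(n, 2))               # continuous 2D samples
    lo, hi = -scale - 4 * std, scale + 4 * std                # quantization range (assumed)
    return np.clip(((x - lo) / (hi - lo) * num_bins).astype(int), 0, num_bins - 1)

samples = eight_gaussians(1000)   # integer array of shape (1000, 2), values in {0, ..., 90}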

Binary MNIST

For the experiment on Binary MNIST, run

python main.py --dataset='mnist' --k_sort=2 --n_hidden_nn=512 --densenet_depth=10 --net_epochs=100 --prior_epochs=30 

To run the experiment without splitpriors, set the flag --with_splitprior False.
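
Combined with the command above, the full call would read:

python main.py --dataset='mnist' --k_sort=2 --n_hidden_nn=512 --densenet_depth=10 --net_epochs=100 --prior_epochs=30 --with_splitprior False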

Cityscapes

For this experiment, it is necessary to download the Cityscapes dataset first. For preprocessing, download the files data_to_npy.py and cityscapes.py from this repository; they convert the original data into three .npy files, which should be placed in ./data/cityscapes/preprocessed (a quick check that the files are in place is sketched at the end of this section). Then run

python main.py --dataset='cityscapes' --k_sort=4 --n_hidden_nn=512 --densenet_depth=15 --net_epochs=100 --prior_epochs=30 

Again, to run the experiment without splitpriors, set the flag --with_splitprior False.
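
If training cannot find the data, a quick sanity check is to verify that the preprocessed files ended up in the expected location. The snippet below only assumes what is stated above, namely that ./data/cityscapes/preprocessed should contain three .npy files; it does not assume their names.

# Check that the three preprocessed Cityscapes .npy files are in place.
from pathlib import Path

preprocessed = Path("./data/cityscapes/preprocessed")
npy_files = sorted(preprocessed.glob("*.npy"))
print(f"Found {len(npy_files)} .npy file(s):", [p.name for p in npy_files])
assert len(npy_files) == 3, "Expected three preprocessed Cityscapes .npy files."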

Acknowledgements

We gratefully acknowledge the financial support of Robert Bosch GmbH.

References

[1] Alexandra Lindt and Emiel Hoogeboom. "Discrete Denoising Flows." ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (2021).
