AdaDM: Enabling Normalization for Image Super-Resolution


You can apply BatchNorm (BN), LayerNorm (LN), or GroupNorm (GN) in SR networks with our AdaDM. Pretrained models (EDSR*/RDN*/NLSN*) can be downloaded from Google Drive or BaiduYun. The extraction password for the BaiduYun link is kymj.

📢 If you use the BasicSR framework, you need to turn off the Exponential Moving Average (EMA) option when applying BN in the generator network (e.g., RRDBNet). You can disable EMA by setting ema_decay=0 in the corresponding .yml configuration file.
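For reference, a minimal excerpt of such a BasicSR .yml option file might look like the following sketch (the placement of ema_decay under the train section follows common BasicSR option files; all other entries are omitted here):

train:
  ema_decay: 0  # disable EMA for the generator when BN is used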

| Model | Scale | File name (.pt) | Urban100 | Manga109 |
|-------|-------|-----------------|----------|----------|
| EDSR  | 2 | --                  | 32.93 | 39.10 |
| EDSR  | 3 | --                  | 28.80 | 34.17 |
| EDSR  | 4 | --                  | 26.64 | 31.02 |
| EDSR* | 2 | EDSR_AdaDM_DIV2K_X2 | 33.12 | 39.31 |
| EDSR* | 3 | EDSR_AdaDM_DIV2K_X3 | 29.02 | 34.48 |
| EDSR* | 4 | EDSR_AdaDM_DIV2K_X4 | 26.83 | 31.24 |
| RDN   | 2 | --                  | 32.89 | 39.18 |
| RDN   | 3 | --                  | 28.80 | 34.13 |
| RDN   | 4 | --                  | 26.61 | 31.00 |
| RDN*  | 2 | RDN_AdaDM_DIV2K_X2  | 33.03 | 39.18 |
| RDN*  | 3 | RDN_AdaDM_DIV2K_X3  | 28.95 | 34.29 |
| RDN*  | 4 | RDN_AdaDM_DIV2K_X4  | 26.72 | 31.18 |
| NLSN  | 2 | --                  | 33.42 | 39.59 |
| NLSN  | 3 | --                  | 29.25 | 34.57 |
| NLSN  | 4 | --                  | 26.96 | 31.27 |
| NLSN* | 2 | NLSN_AdaDM_DIV2K_X2 | 33.59 | 39.67 |
| NLSN* | 3 | NLSN_AdaDM_DIV2K_X3 | 29.53 | 34.95 |
| NLSN* | 4 | NLSN_AdaDM_DIV2K_X4 | 27.24 | 31.73 |

Models marked with * were trained with AdaDM; their file names correspond to the downloadable pretrained checkpoints.

Preparation

Please refer to EDSR for instructions on dataset download and software installation, then clone our repository as follows:

git clone https://github.com/njulj/AdaDM.git

Training

cd AdaDM/src
bash train.sh

An example training command in train.sh looks like this:

CUDA_VISIBLE_DEVICES=$GPU_ID python3 main.py --template EDSR_paper --scale 2\
        --n_GPUs 1 --batch_size 16 --patch_size 96 --rgb_range 255 --res_scale 0.1\
        --save EDSR_AdaDM_Test_DIV2K_X2 --dir_data ../dataset --data_test Urban100\
        --epochs 1000 --decay 200-400-600-800 --lr 1e-4 --save_models --save_results 

Here, $GPU_ID specifies the GPU id used for training. EDSR_AdaDM_Test_DIV2K_X2 is the directory where all files generated during training are saved. --dir_data specifies the root directory of all datasets; place the DIV2K and benchmark (e.g., Urban100) datasets under this directory.
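For reference, a dataset layout under --dir_data that follows the EDSR data convention might look like this (folder names are illustrative and depend on how you prepared the data):

../dataset/DIV2K/DIV2K_train_HR
../dataset/DIV2K/DIV2K_train_LR_bicubic/X2
../dataset/benchmark/Urban100/HR
../dataset/benchmark/Urban100/LR_bicubic/X2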

Testing

cd AdaDM/src
bash test.sh

An example testing command in test.sh looks like this:

CUDA_VISIBLE_DEVICES=$GPU_ID python3 main.py --template EDSR_paper --scale $SCALE\
        --pre_train ../experiment/test/model/EDSR_AdaDM_DIV2K_X$SCALE.pt\
        --dir_data ../dataset --n_GPUs 1 --test_only --data_test $TEST_DATASET

Here, $GPU_ID specifies the GPU id used for testing. $SCALE indicates the upscaling factor (e.g., 2, 3, or 4). --pre_train specifies the path to the saved checkpoint. $TEST_DATASET indicates the dataset to be evaluated.

Acknowledgement

This repository is built on EDSR and NLSN. We thank the authors for sharing their code.
