Multi-Glimpse Network With Python

Overview

Multi-Glimpse Network

Our code requires Python ≥ 3.8

Installation

For example, venv + pip:

$ python3 -m venv env
$ source env/bin/activate
(env) $ python3 -m pip install -r requirements.txt
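
As a quick sanity check after installing, you can confirm the interpreter and framework versions; this assumes PyTorch is listed in requirements.txt (the checkpoints and training commands below rely on it):

(env) $ python3 --version                                    # should report 3.8 or newer
(env) $ python3 -c "import torch; print(torch.__version__)"  # assumes PyTorch is in requirements.txt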

Evaluation

Accuracy on clean images

  1. Create ImageNet100 from ImageNet (using symbolic links).
$ python3 tools/create_imagenet100.py tools/imagenet100.txt \
    /path/to/ImageNet /path/to/ImageNet100
  2. Download checkpoints from Google Drive.

  3. Test accuracy.

$ export dataset="--train_dir /path/to/ImageNet100/train \
    --val_dir /path/to/ImageNet100/val \
    --dataset imagenet --num_class 100"
# Baseline
$ python3 main.py $dataset --test --n_iter 1 --scale 1.0  --model resnet18 \
    --checkpoint resnet18_baseline
# Ours
$ python3 main.py $dataset --test --n_iter 4 --scale 2.33 --model resnet18 \
    --checkpoint resnet18_ours --alpha 0.6 --s 0.02

Add the --flop_count flag to count the approximate FLOPs for inference on a single image (computed with fvcore).
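
For instance, a minimal example reusing the $dataset variable and checkpoint from the step above (only the --flop_count flag is new relative to the commands shown earlier):

# Count approximate FLOPs per image for the multi-glimpse model
$ python3 main.py $dataset --test --n_iter 4 --scale 2.33 --model resnet18 \
    --checkpoint resnet18_ours --alpha 0.6 --s 0.02 --flop_count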

Accuracy on adversarial attacks (PGD)

  1. Test adversarial accuracy.
# Baseline
$ python3 main.py $dataset --test --n_iter 1 --scale 1.0  --adv --step_k 10 \
    --model resnet18 --checkpoint resnet18_baseline
# Ours
$ python3 main.py $dataset --test --n_iter 4 --scale 2.33 --adv --step_k 10 \
    --model resnet18 --checkpoint resnet18_ours --alpha 0.6 --s 0.02

Accuracy on common corruptions

  1. Create ImageNet100-C from ImageNet-C (using symbolic links).
$ python3 tools/create_imagenet100c.py  \
    tools/imagenet100.txt  /path/to/ImageNet-C/ /path/to/ImageNet100-C/
  2. Test for a single corruption.
$ export dataset="--train_dir /path/to/ImageNet100/train \
    --val_dir /path/to/ImageNet100-C/pixelate/5 \
    --dataset imagenet --num_class 100"
# Baseline
$ python3 main.py $dataset --test --n_iter 1 --scale 1.0  --model resnet18 \
    --checkpoint resnet18_baseline
# Ours
$ python3 main.py $dataset --test --n_iter 4 --scale 2.33 --model resnet18 \
    --checkpoint resnet18_ours --alpha 0.6 --s 0.02
  3. A simple script to test all corruptions and collect the results.
# Modify tools/eval_imagenet100c.py and run it to generate script
$ python3 tools/eval_imagenet100c.py /home2/ImageNet100-C/ > run.sh
# Evaluate
$ bash run.sh
# Collect results
$ python3 tools/collect_imagenet100c.py

Training

$ export dataset="--train_dir /path/to/ImageNet100/train \
    --val_dir /path/to/ImageNet100/val \
    --dataset imagenet --num_class 100"
# Baseline
$ python3 main.py $dataset --epochs 400 --n_iter 1 --scale 1.0 \
    --model resnet18 --gpu 0,1,2,3
# Ours
$ python3 main.py $dataset --epochs 400 --n_iter 4 --scale 2.33 \
    --model resnet18 --alpha 0.6 --s 0.02  --gpu 0,1,2,3

Check TensorBoard for the logs. (When training with multiple GPUs, logged values may be scaled by the number of GPUs, except for the validation accuracy.)

$ tensorboard --logdir=logs

Note that we left our explorations in the code for further study, e.g., self-supervised spatial guidance and a dynamic gradient re-scaling operation.

Owner
LinkedIn: https://www.linkedin.com/in/sia-huat-tan-2bb6911a5/