PyTorch Implementation of Temporal Output Discrepancy for Active Learning, ICCV 2021

Overview

Temporal Output Discrepancy for Active Learning

PyTorch implementation of Semi-Supervised Active Learning with Temporal Output Discrepancy, ICCV 2021.

Introduction

  • We present a loss measure, Temporal Output Discrepancy (TOD), which estimates the loss of an unlabeled sample by evaluating the distance between the model's outputs at different SGD steps (see the sketch after this list).
  • We theoretically demonstrate that TOD is a lower bound of the accumulated sample loss.
  • Based on TOD, we develop an unlabeled data sampling strategy and a semi-supervised training scheme for active learning.
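The core quantity is straightforward to compute. The following is a minimal illustrative sketch (not the repository's code; the model and variable names are placeholders) of TOD as the L2 distance between the outputs of the same network taken at two different SGD steps:

import torch

@torch.no_grad()
def temporal_output_discrepancy(model_t, model_t_prev, x):
    # Distance between the outputs of the same network at two SGD steps;
    # the paper shows this lower-bounds the accumulated sample loss.
    out_t = model_t(x)          # outputs at the current step t
    out_prev = model_t_prev(x)  # outputs at an earlier step
    return torch.norm(out_t - out_prev, p=2, dim=1)  # one score per sample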

TOD Active Data Selection

Results

Requirements

numpy

torch >= 1.0.1

torchvision >= 0.2.1

Data Preparation

Download image classification datasets (e.g., CIFAR-10, CIFAR-100, SVHN, or Caltech101) and put them under ./data.
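Alternatively, for CIFAR-10, CIFAR-100, and SVHN, torchvision can download the archives directly into ./data in the layout shown below; this is an optional convenience and not part of the repository's scripts:

from torchvision import datasets

# CIFAR-10  -> ./data/cifar-10-batches-py
datasets.CIFAR10(root='./data', train=True, download=True)
# CIFAR-100 -> ./data/cifar-100-python
datasets.CIFAR100(root='./data', train=True, download=True)
# SVHN      -> ./data/svhn/train_32x32.mat and ./data/svhn/test_32x32.mat
datasets.SVHN(root='./data/svhn', split='train', download=True)
datasets.SVHN(root='./data/svhn', split='test', download=True)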

If you would like to try the Caltech101 dataset, please download the pretrained ResNet-18 model and put it under ./.

The directory structure should look like this:

TOD
|-- data
    |-- 101_ObjectCategories
        |-- accordion
        |-- airplanes
        |-- anchor
        |-- ...
    |-- cifar-10-batches-py
    |-- cifar-100-python
    |-- svhn
        |-- train_32x32.mat
        |-- test_32x32.mat
|-- resnet18-5c106cde.pth
|-- ...

Quick Start

Run the TOD active learning experiment on CIFAR-10:

bash run.sh

Specify Datasets, Active Sampling Strategies, and Auxiliary Losses

The dataset configurations, active learning settings (trials and cycles), and neural network training settings can be found in the ./config folder.

We provide implementations of active data sampling strategies, including random sampling, learning loss for active learning (LL4AL), and our TOD sampling. Use --sampling to specify a sampling strategy.
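To illustrate the idea behind TOD sampling (hypothetical helper names, not the repository's interface), the unlabeled pool is ranked by output discrepancy between two SGD steps and the highest-scoring samples are queried for annotation:

import torch

@torch.no_grad()
def select_by_tod(model_t, model_t_prev, unlabeled_loader, budget):
    # Score every unlabeled sample by its temporal output discrepancy
    # and return the indices of the `budget` highest-scoring samples.
    scores = []
    for x, _ in unlabeled_loader:  # labels are unused here
        disc = torch.norm(model_t(x) - model_t_prev(x), p=2, dim=1)
        scores.append(disc)
    scores = torch.cat(scores)
    _, query_idx = torch.topk(scores, k=budget)
    return query_idx  # request labels for these samples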

We also provide implementations of auxiliary training losses, including LL4AL and our COD loss. Use --auxiliary to specify an auxiliary loss.
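Conceptually, the auxiliary loss penalizes output changes on unlabeled data between SGD steps, acting as an unsupervised regularizer alongside the supervised objective. The sketch below uses hypothetical names and a placeholder weight; it is not the repository's exact implementation:

import torch
import torch.nn.functional as F

def semi_supervised_loss(model, model_prev, x_lab, y_lab, x_unlab, weight=0.05):
    # Supervised cross-entropy on labeled data plus an unsupervised
    # output-discrepancy penalty on unlabeled data (weight is a placeholder).
    sup = F.cross_entropy(model(x_lab), y_lab)
    with torch.no_grad():
        target = model_prev(x_unlab)  # frozen outputs from an earlier step
    unsup = F.mse_loss(model(x_unlab), target)
    return sup + weight * unsup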

Examples

CIFAR-100 dataset, TOD sampling, no unsupervised loss:

python main_TOD.py --config cifar100 --sampling TOD --auxiliary NONE

Caltech101 dataset, random sampling, COD loss:

python main_TOD.py --config caltech101 --sampling RANDOM --auxiliary TOD

SVHN dataset, LL4AL sampling, LL4AL loss:

python main_LL4AL.py --config svhn --sampling LL4AL --auxiliary LL4AL

Citation

 @inproceedings{huang2021semi,
  title={Semi-Supervised Active Learning with Temporal Output Discrepancy},
  author={Huang, Siyu and Wang, Tianyang and Xiong, Haoyi and Huan, Jun and Dou, Dejing},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2021}
 }

Contact

Siyu Huang

[email protected]
