Essentials for Class Incremental Learning

Official repository of the paper 'Essentials for Class Incremental Learning'

Overview

This PyTorch repository contains the code for our work Essentials for Class Incremental Learning.

This work presents a straightforward class-incremental learning approach that focuses on the essential components and already exceeds the state of the art without integrating sophisticated modules.

Requirements

To install requirements:

pip install -r requirements.txt

Training and Evaluation (CIFAR-100, ImageNet-100, ImageNet-1k)

The following scripts contain both the training and evaluation code. The model is evaluated after each incremental phase of class-IL.
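
The class-IL protocol itself is easy to state. The sketch below is illustrative only (plain Python, not the repository's training code; train and evaluate are hypothetical helpers): after each phase, the model is evaluated on all classes seen so far.

# Illustrative class-IL schedule for CIFAR-100 with 50 base classes and
# 10 new classes per phase (cf. --start-classes / --new-classes below).
start_classes, new_classes, total_classes = 50, 10, 100

seen = list(range(start_classes))
phases = [list(range(c, c + new_classes))
          for c in range(start_classes, total_classes, new_classes)]

for t, task in enumerate(phases, start=1):
    # train(model, task)           # hypothetical: fit on the new classes
    seen += task
    # acc = evaluate(model, seen)  # hypothetical: accuracy over all seen classes
    print(f"phase {t}: evaluated on {len(seen)} classes")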

With Knowledge Distillation (KD)

To train the base CCIL model:

bash ./scripts/run_cifar.sh
bash ./scripts/run_imagenet100.sh
bash ./scripts/run_imagenet1k.sh

To train CCIL + Self-Distillation (SD):

bash ./scripts/run_cifar_w_sd.sh
bash ./scripts/run_imagenet100_w_sd.sh
bash ./scripts/run_imagenet1k_w_sd.sh
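
For reference, a common form of the KD loss in class-IL distills the previous model's logits into the current one. The snippet below is a generic temperature-scaled sketch under that assumption, not the exact loss from the paper (see --kd / --w-kd in the argument list below):

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Standard temperature-scaled distillation loss (a common class-IL
    # formulation; the exact CCIL loss is defined in the paper and code).
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

# Total objective per phase (schematically): cross-entropy on the new data
# plus w_kd * kd_loss, where w_kd corresponds to the --w-kd argument.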

Results (CIFAR-100)

| Model name | Avg Acc (5 iTasks) | Avg Acc (10 iTasks) |
|------------|--------------------|---------------------|
| CCIL       | 66.44              | 64.86               |
| CCIL + SD  | 67.17              | 65.86               |

Results (ImageNet-100)

| Model name | Avg Acc (5 iTasks) | Avg Acc (10 iTasks) |
|------------|--------------------|---------------------|
| CCIL       | 77.99              | 75.99               |
| CCIL + SD  | 79.44              | 76.77               |

Results (ImageNet-1k)

| Model name | Avg Acc (5 iTasks) | Avg Acc (10 iTasks) |
|------------|--------------------|---------------------|
| CCIL       | 67.53              | 65.61               |
| CCIL + SD  | 68.04              | 66.25               |

List of Arguments

The following flags can be combined in a single run (see the example sketched after the list).

  • Distillation Methods

    • Knowledge Distillation (--kd, --w-kd X), X is the weight of the KD loss, default=1.0
    • Representation Distillation (--rd, --w-rd X), X is the weight of the cos-RD loss, default=0.05
    • Contrastive Representation Distillation (--nce, --w-nce X), only valid for CIFAR-100, X is the weight of the NCE loss
  • Regularization for the first task

    • Self-distillation (--num-sd X, --epochs-sd Y), X is the number of generations, Y is the number of self-distillation epochs
    • Mixup (--mixup, --mixup-alpha X), X is the mixup alpha value, default=0.1
    • Heavy Augmentation (--aug)
    • Label Smoothing (--label-smoothing, --smoothing-alpha X), X is the smoothing alpha value, default=0.1
  • Incremental class setting

    • Number of base classes (--start-classes 50)
    • 5-phase setting (--new-classes 10)
    • 10-phase setting (--new-classes 5)
  • Cosine learning rate decay (--cosine)

  • Save and Load

    • Experiment name (--exp-name X)
    • Save checkpoints (--save)
    • Resume from checkpoints (--resume, --resume-path X), resuming is only supported from the first-task snapshot
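
As an illustration of how these flags combine, here is a hypothetical invocation. The entry-point name main.py is an assumption, not taken from the repository; the real commands and defaults live in the ./scripts/*.sh files above.

# Hypothetical invocation (entry-point name assumed; see ./scripts/ for
# the actual commands): 50 base classes + 10 new classes per phase gives
# the 5-phase CIFAR-100 setting, with KD enabled and checkpoints saved.
python main.py \
    --exp-name ccil_cifar_5phase \
    --start-classes 50 \
    --new-classes 10 \
    --kd --w-kd 1.0 \
    --cosine \
    --save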

Citation

@article{ccil_mittal,
    author  = {Sudhanshu Mittal and Silvio Galesso and Thomas Brox},
    title   = {Essentials for Class Incremental Learning},
    journal = {arXiv preprint arXiv:2102.09517},
    year    = {2021},
}