Implementation of Online Label Smoothing in PyTorch

Overview

PyTorch implementation of Online Label Smoothing (OLS) presented in Delving Deep into Label Smoothing.

Introduction

As the abstract states, OLS is a strategy that generates soft labels based on the statistics of the model's predictions for the target category. The core idea is that, instead of using fixed soft labels for every epoch, we keep updating them based on the statistics of correctly predicted samples.

More details and experiment results can be found in the paper.
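
To make the idea concrete, here is a minimal, hedged sketch of the mechanism (not the code of this repository): during an epoch the softmax outputs of correctly classified samples are accumulated per target class; at the end of the epoch the accumulated rows are normalized and become the soft labels for the next epoch, blended with the hard cross-entropy loss via alpha. The name OLSSketch, the initialization from classic label smoothing, and the direction of the alpha weighting are assumptions for illustration only.

import torch
import torch.nn.functional as F

class OLSSketch:
    """Conceptual sketch only; the repository's OnlineLabelSmoothing may differ in details."""
    def __init__(self, n_classes, alpha=0.5, smoothing=0.1):
        self.alpha = alpha
        # Soft labels used in the current epoch; start from classic label smoothing (assumption).
        self.supervise = torch.full((n_classes, n_classes), smoothing / (n_classes - 1))
        self.supervise.fill_diagonal_(1.0 - smoothing)
        # Statistics accumulated during the current epoch.
        self.update = torch.zeros(n_classes, n_classes)

    def __call__(self, logits, target):
        probs = logits.softmax(dim=-1)
        # Accumulate predictions of correctly classified samples into their target-class row.
        correct = probs.argmax(dim=-1) == target
        if correct.any():
            self.update.index_add_(0, target[correct], probs[correct].detach())
        # Blend hard cross-entropy with the soft loss built from last epoch's matrix
        # (which term alpha weights is an assumption here).
        hard = F.cross_entropy(logits, target)
        soft = -(self.supervise[target] * logits.log_softmax(dim=-1)).sum(dim=-1).mean()
        return self.alpha * hard + (1.0 - self.alpha) * soft

    def next_epoch(self):
        # Normalize the accumulated rows and use them as next epoch's soft labels.
        self.supervise = self.update / self.update.sum(dim=-1, keepdim=True).clamp_min(1e-12)
        self.update = torch.zeros_like(self.update)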

Usage

Usage of OnlineLabelSmoothing is pretty straightforward. Just use it as you would use PyTorch's CrossEntropyLoss. The only difference is that at the end of each epoch you should call OnlineLabelSmoothing.next_epoch(), which updates the OnlineLabelSmoothing.supervise matrix that provides the soft labels for the next epoch.

Standalone

from ols import OnlineLabelSmoothing
import torch

k = 4  # Number of classes
b = 32  # Batch size
criterion = OnlineLabelSmoothing(alpha=0.5, n_classes=k, smoothing=0.1)
logits = torch.randn(b, k)  # Predictions
y = torch.randint(k, (b,))  # Ground truth

loss = criterion(logits, y)
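
In a full training loop you would backpropagate this loss like any other criterion and refresh the soft labels once per epoch; a brief continuation of the snippet above (the optimizer step is omitted here):

loss.backward()          # backpropagate as with any other loss
# ... optimizer step, remaining batches of the epoch ...
criterion.next_epoch()   # refresh the soft-label matrix at the end of the epoch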

PyTorch

from ols import OnlineLabelSmoothing

criterion = OnlineLabelSmoothing(alpha=..., n_classes=...)
for epoch in range(...):  # loop over the dataset multiple times
    for i, data in enumerate(...):
        inputs, labels = data
        # zero the parameter gradients
        optimizer.zero_grad()
        # forward + backward + optimize
        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
    print(f'Epoch {epoch} finished!')
    # Update the soft labels for next epoch
    criterion.next_epoch()

PyTorchLightning

With PyTorch Lightning you can simply call next_epoch() at the end of each training epoch by overriding the on_train_epoch_end hook:

import pytorch_lightning as pl
from ols import OnlineLabelSmoothing


class LitClassification(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.criterion = OnlineLabelSmoothing(alpha=..., n_classes=...)

    def forward(self, x):
        pass

    def configure_optimizers(self):
        pass

    def training_step(self, train_batch, batch_idx):
        pass

    def on_train_epoch_end(self, **kwargs):
        self.criterion.next_epoch()
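
For completeness, a hedged sketch of how training_step might use the criterion; self.model and the (inputs, labels) batch layout are assumptions for illustration, not part of this repository:

    def training_step(self, train_batch, batch_idx):
        # Hypothetical example: assumes `self.model` is defined and batches are (inputs, labels).
        inputs, labels = train_batch
        logits = self.model(inputs)
        loss = self.criterion(logits, labels)
        return loss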

Installation

pip install -r requirements.txt

Citation

@misc{zhang2020delving,
      title={Delving Deep into Label Smoothing}, 
      author={Chang-Bin Zhang and Peng-Tao Jiang and Qibin Hou and Yunchao Wei and Qi Han and Zhen Li and Ming-Ming Cheng},
      year={2020},
      eprint={2011.12562},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}