Consistency Regularization for Adversarial Robustness

Overview

Official PyTorch implementation of Consistency Regularization for Adversarial Robustness by Jihoon Tack, Sihyun Yu, Jongheon Jeong, Minseon Kim, Sung Ju Hwang, and Jinwoo Shin.

1. Dependencies

conda create -n con-adv python=3
conda activate con-adv

conda install pytorch torchvision cudatoolkit=11.0 -c pytorch 

pip install git+https://github.com/fra31/auto-attack
pip install advertorch tensorboardX

2. Training

2.1. Training options and description

The options for training are as follows:

  • <DATASET>: {cifar10,cifar100,tinyimagenet}
  • <AUGMENT>: {base,ccg}
  • <ADV_TRAIN OPTION>: {adv_train,adv_trades,adv_mart}

The current code assumes l_infinity-constrained adversarial training and PreAct-ResNet-18 as the base model.
To change these options, simply add the following flags (an example command is shown after the list):

  • WideResNet-34-10: --model wrn3410
  • l_2 constraint: --distance L2
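
For example, combining both flags with the adversarial training command from Section 2.2 (this particular flag combination is assumed to be valid, not taken from the original instructions):

% Example: Standard AT with WideResNet-34-10 under the l_2 constraint
python train.py --mode adv_train --augment base --dataset cifar10 --model wrn3410 --distance L2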

2.2. Training code

Standard cross-entropy training

% Standard cross-entropy
python train.py --mode ce --augment base --dataset <DATASET>

Adversarial training

% Adversarial training
python train.py --mode <ADV_TRAIN OPTION> --augment <AUGMENT> --dataset <DATASET>

% Example: Standard AT on CIFAR-10
python train.py --mode adv_train --augment base --dataset cifar10
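
For readers unfamiliar with adversarial training, the sketch below illustrates a single PGD-based training step under the l_infinity constraint. It is a generic illustration using common default hyperparameters (epsilon = 8/255, step size 2/255, 10 steps), not the exact code in train.py:

import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, n_iters=10):
    # Craft an l_infinity-bounded adversarial example with PGD.
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(n_iters):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        # Project back onto the epsilon ball and the valid pixel range.
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def adv_train_step(model, optimizer, x, y):
    # Standard AT: minimize cross-entropy on the adversarial examples.
    x_adv = pgd_attack(model, x, y)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()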

Consistency regularization

% Consistency regularization
python train.py --consistency --mode <ADV_TRAIN OPTION> --augment <AUGMENT> --dataset <DATASET>

% Example: Consistency regularization based on standard AT on CIFAR-10
python train.py --consistency --mode adv_train --augment ccg --dataset cifar10 
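
Conceptually, the consistency term pushes the model to produce similar predictive distributions for adversarial examples crafted from two different augmentations of the same image. The snippet below is only a rough sketch of one such term (a temperature-scaled Jensen-Shannon divergence); the exact divergence, temperature, and weighting used in train.py may differ:

import torch.nn.functional as F

def consistency_loss(logits_adv1, logits_adv2, T=1.0):
    # Jensen-Shannon-style consistency between the predictive distributions
    # of adversarial examples from two augmented views of the same image.
    p1 = F.softmax(logits_adv1 / T, dim=1)
    p2 = F.softmax(logits_adv2 / T, dim=1)
    m = 0.5 * (p1 + p2)
    # F.kl_div(input, target) computes KL(target || input), with input in log-space.
    return 0.5 * (F.kl_div(m.log(), p1, reduction='batchmean')
                  + F.kl_div(m.log(), p2, reduction='batchmean'))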

3. Evaluation

3.1. Evaluation options and description

The options describing the threat model are as follows:

  • <DISTANCE>: {Linf,L2,L1}, the norm constraint type
  • <EPSILON>: the epsilon ball size
  • <ALPHA>: the step size of PGD optimization
  • <NUM_ITER>: iteration number of PGD optimization

3.2. Evaluation code

Evaluate clean accuracy

python eval.py --mode test_clean_acc --dataset <DATASET> --load_path <MODEL_PATH>

Evaluate clean & robust accuracy against PGD

python eval.py --mode test_adv_acc --distance <DISTANCE> --epsilon <EPSILON> --alpha <ALPHA> --n_iters <NUM_ITER> --dataset <DATASET> --load_path <MODEL_PATH>
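
A concrete instantiation, assuming <EPSILON> and <ALPHA> are given as floating-point pixel fractions (e.g., 8/255 ≈ 0.0314); the exact numeric format eval.py expects is an assumption, so adjust if the script parses these flags differently:

% Example: PGD-20 under l_infinity, epsilon = 8/255, alpha = 2/255, on CIFAR-10
python eval.py --mode test_adv_acc --distance Linf --epsilon 0.0314 --alpha 0.00784 --n_iters 20 --dataset cifar10 --load_path <MODEL_PATH>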

Evaluate clean & robust accuracy against AutoAttack

python eval.py --mode test_auto_attack --epsilon <EPSILON> --distance <DISTANCE> --dataset <DATASET> --load_path <MODEL_PATH>

Evaluate mean corruption error (mCE)

python eval.py --mode test_mce --dataset <DATASET> --load_path <MODEL_PATH>
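
For reference, mCE on CIFAR-10-C is commonly computed by averaging the classification error over all corruption types and severity levels (some works additionally normalize each corruption's error by a baseline model's error; whether eval.py applies such normalization is not specified here). A minimal sketch of the unnormalized average, assuming per-corruption, per-severity error rates are already available:

import numpy as np

def mean_corruption_error(errors):
    # errors: dict mapping corruption name -> list of error rates (%)
    #         over severity levels 1..5 on CIFAR-10-C.
    per_corruption = [np.mean(sev_errors) for sev_errors in errors.values()]
    return float(np.mean(per_corruption))

# Example usage with hypothetical numbers:
# mce = mean_corruption_error({'gaussian_noise': [12.1, 15.3, 19.8, 24.0, 30.2],
#                              'fog': [8.4, 9.1, 10.5, 12.7, 16.9]})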

4. Results

White box attack

Clean accuracy and robust accuracy (%) against white-box attacks for PreAct-ResNet-18 trained on CIFAR-10.
We use the l_infinity threat model with epsilon = 8/255.

Method                 Clean   PGD-20   PGD-100   AutoAttack
Standard AT            84.48   46.09    45.89     40.74
+ Consistency (Ours)   84.65   54.86    54.67     47.83
TRADES                 81.35   51.41    51.13     46.41
+ Consistency (Ours)   81.10   54.86    54.68     48.30
MART                   81.35   49.60    49.41     41.89
+ Consistency (Ours)   81.10   55.31    55.16     47.02

Unseen adversaries

Robust accuracy (%) of PreAct-ResNet-18 trained with the l_infinity, epsilon = 8/255 constraint, evaluated against unseen attacks.
For the unseen attacks, we use PGD-100 under a differently sized l_infinity epsilon ball and under other types of norm balls.

Method                 l_infinity, eps=16/255   l_2, eps=300/255   l_1, eps=4000/255
Standard AT            15.77                    26.91              32.44
+ Consistency (Ours)   22.49                    34.43              42.45
TRADES                 23.87                    28.31              28.64
+ Consistency (Ours)   27.18                    37.11              46.73
MART                   20.08                    30.15              27.00
+ Consistency (Ours)   27.91                    38.10              43.29

Mean corruption error

Mean corruption error (mCE) (%) of PreAct-ResNet-18 trained on CIFAR-10 and evaluated on the CIFAR-10-C dataset.

Method                 mCE
Standard AT            24.05
+ Consistency (Ours)   22.06
TRADES                 26.17
+ Consistency (Ours)   24.05
MART                   27.75
+ Consistency (Ours)   26.75
