Open source implementation of AceNAS: Learning to Rank Ace Neural Architectures with Weak Supervision of Weight Sharing

Overview

This repo contains the experiment code of AceNAS and is not considered an official release. We are working on integrating AceNAS as a built-in strategy in NNI.

Data Preparation

  1. Download our prepared data from Google Drive. The directory should look like this (you can sanity-check the downloaded files with the snippet after this list):
data
├── checkpoints
│   ├── acenas-m1.pth.tar
│   ├── acenas-m2.pth.tar
│   └── acenas-m3.pth.tar
├── gcn
│   ├── nasbench101_gt_all.pkl
│   ├── nasbench201cifar10_gt_all.pkl
│   ├── nasbench201_gt_all.pkl
│   ├── nasbench201imagenet_gt_all.pkl
│   ├── nds_amoeba_gt_all.pkl
│   ├── nds_amoebaim_gt_all.pkl
│   ├── nds_dartsfixwd_gt_all.pkl
│   ├── nds_darts_gt_all.pkl
│   ├── nds_dartsim_gt_all.pkl
│   ├── nds_enasfixwd_gt_all.pkl
│   ├── nds_enas_gt_all.pkl
│   ├── nds_enasim_gt_all.pkl
│   ├── nds_nasnet_gt_all.pkl
│   ├── nds_nasnetim_gt_all.pkl
│   ├── nds_pnasfixwd_gt_all.pkl
│   ├── nds_pnas_gt_all.pkl
│   ├── nds_pnasim_gt_all.pkl
│   ├── nds_supernet_evaluate_all_test1_amoeba.json
│   ├── nds_supernet_evaluate_all_test1_dartsfixwd.json
│   ├── nds_supernet_evaluate_all_test1_darts.json
│   ├── nds_supernet_evaluate_all_test1_enasfixwd.json
│   ├── nds_supernet_evaluate_all_test1_enas.json
│   ├── nds_supernet_evaluate_all_test1_nasnet.json
│   ├── nds_supernet_evaluate_all_test1_pnasfixwd.json
│   ├── nds_supernet_evaluate_all_test1_pnas.json
│   ├── supernet_evaluate_all_test1_nasbench101.json
│   ├── supernet_evaluate_all_test1_nasbench201cifar10.json
│   ├── supernet_evaluate_all_test1_nasbench201imagenet.json
│   └── supernet_evaluate_all_test1_nasbench201.json
├── nb201
│   ├── split-cifar100.txt
│   ├── split-cifar10-valid.txt
│   └── split-imagenet-16-120.txt
├── proxyless
│   ├── imagenet
│   │   ├── augment_files.txt
│   │   ├── test_files.txt
│   │   ├── train_files.txt
│   │   └── val_files.txt
│   ├── proxyless-84ms-train.csv
│   ├── proxyless-ws-results.csv
│   └── tunas-proxylessnas-search.csv
└── tunas
    ├── imagenet_valid_split_filenames.txt
    ├── random_architectures.csv
    └── searched_architectures.csv
  2. (Required for benchmark experiments) Download the CIFAR-10, CIFAR-100, and ImageNet-16-120 datasets and put them under data as well:
data
├── cifar10
│   └── cifar-10-batches-py
│       ├── batches.meta
│       ├── data_batch_1
│       ├── data_batch_2
│       ├── data_batch_3
│       ├── data_batch_4
│       ├── data_batch_5
│       ├── readme.html
│       └── test_batch
├── cifar100
│   └── cifar-100-python
│       ├── meta
│       ├── test
│       └── train
└── imagenet16
    ├── train_data_batch_1
    ├── train_data_batch_10
    ├── train_data_batch_2
    ├── train_data_batch_3
    ├── train_data_batch_4
    ├── train_data_batch_5
    ├── train_data_batch_6
    ├── train_data_batch_7
    ├── train_data_batch_8
    ├── train_data_batch_9
    └── val_data
  3. (Required for ImageNet experiments) Prepare ImageNet. You can put it anywhere.

  4. (Optional) Copy tunas (https://github.com/google-research/google-research/tree/master/tunas) to a folder named tunas.
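
After downloading, you can run a quick sanity check on the prepared data. The snippet below is a minimal sketch: the internal schema of the *_gt_all.pkl files is not documented here, so it only inspects the top-level object.

import pickle

# Load one of the downloaded ground-truth files and report its top-level
# structure, without assuming any particular schema.
with open("data/gcn/nasbench201_gt_all.pkl", "rb") as f:
    ground_truth = pickle.load(f)

print(type(ground_truth))
if hasattr(ground_truth, "__len__"):
    print("entries:", len(ground_truth))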

Evaluate pre-trained models

We provide 3 checkpoints obtained from 3 different runs in data/checkpoints. Please evaluate them via the following commands.

python -m tools.standalone.imagenet_eval acenas-m1 /path/to/your/imagenet
python -m tools.standalone.imagenet_eval acenas-m2 /path/to/your/imagenet
python -m tools.standalone.imagenet_eval acenas-m3 /path/to/your/imagenet
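
If you want to peek inside one of these checkpoints before launching a full evaluation, the sketch below assumes only that the .pth.tar files can be opened with torch.load; their exact layout is not documented here.

import torch

# Inspect the top-level structure of a provided checkpoint, without assuming
# anything about the layout of its contents.
ckpt = torch.load("data/checkpoints/acenas-m1.pth.tar", map_location="cpu")
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))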

Train supernet

python -m tools.supernet.nasbench101 experiments/supernet/nasbench101.yml
python -m tools.supernet.nasbench201 experiments/supernet/nasbench201.yml
python -m tools.supernet.nds experiments/supernet/darts.yml
python -m tools.supernet.proxylessnas experiments/supernet/proxylessnas.yml

Please refer to the experiments/supernet folder for more configurations.

Benchmark experiments

We've already provided weight-sharing results from the supernet so that you do not have to train your own. The provided results are in the JSON files under data/gcn.

# pretrain
python -m gcn.benchmarks.pretrain data/gcn/supernet_evaluate_all_test1_${SEARCHSPACE}.json data/gcn/${SEARCHSPACE}_gt_all.pkl --metric_keys top1 flops params
# finetune
python -m gcn.benchmarks.train --use_train_samples --budget ${BUDGET} --test_dataset data/gcn/${SEARCHSPACE}_gt_all.pkl --iteration 5 \
    --loss lambdarank --gnn_type gcn --early_stop_patience 50 --learning_rate 0.005 --opt_type adam --wd 5e-4 --epochs 300 --bs 20 \
    --resume /path/to/previous/output.pt
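
To run the pretraining step over every benchmark search space in one go, a convenience sketch follows. It is not part of the original scripts; the search-space names are inferred from the file names under data/gcn, where the NDS spaces prefix both files with nds_.

import subprocess

# Search spaces inferred from the JSON/pickle file names under data/gcn.
NASBENCH_SPACES = ["nasbench101", "nasbench201", "nasbench201cifar10", "nasbench201imagenet"]
NDS_SPACES = ["amoeba", "darts", "dartsfixwd", "enas", "enasfixwd", "nasnet", "pnas", "pnasfixwd"]

jobs = [(f"data/gcn/supernet_evaluate_all_test1_{s}.json", f"data/gcn/{s}_gt_all.pkl")
        for s in NASBENCH_SPACES]
jobs += [(f"data/gcn/nds_supernet_evaluate_all_test1_{s}.json", f"data/gcn/nds_{s}_gt_all.pkl")
         for s in NDS_SPACES]

for supernet_json, gt_pickle in jobs:
    # Same invocation as the pretrain command above, once per search space.
    subprocess.run([
        "python", "-m", "gcn.benchmarks.pretrain",
        supernet_json, gt_pickle,
        "--metric_keys", "top1", "flops", "params",
    ], check=True)

The fine-tuning step then resumes from the checkpoint produced by pretraining via --resume.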

Running baselines

BRP-NAS:

# pretrain
python -m gcn.benchmarks.pretrain data/gcn/supernet_evaluate_all_test1_${SEARCHSPACE}.json data/gcn/${SEARCHSPACE}_gt_all.pkl --metric_keys flops
# finetune
python -m gcn.benchmarks.train --use_train_samples --budget ${BUDGET} --test_dataset data/gcn/${SEARCHSPACE}_gt_all.pkl --iteration 5 \
    --loss brp --gnn_type brp --early_stop_patience 35 --learning_rate 0.00035 \
    --opt_type adamw --wd 5e-4 --epochs 250 --bs 64 --resume /path/to/previous/output.pt

Vanilla:

python -m gcn.benchmarks.train --use_train_samples --budget ${BUDGET} --test_dataset data/gcn/${SEARCHSPACE}_gt_all.pkl --iteration 1 \
    --loss mse --gnn_type vanilla --n_hidden 144 --learning_rate 2e-4 --opt_type adam --wd 1e-3 --epochs 300 --bs 10
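
As the flags suggest, the baselines differ mainly in the predictor head and loss: BRP-NAS uses a binary-relation formulation (--loss brp --gnn_type brp), while the vanilla baseline regresses accuracy directly with an MSE loss (--loss mse --gnn_type vanilla), in contrast to the LambdaRank ranking loss used by AceNAS above.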

ProxylessNAS search space

Train GCN

python -m gcn.proxyless.pretrain --metric_keys ws_accuracy simulated_pixel1_time_ms flops params
python -m gcn.proxyless.train --loss lambdarank --early_stop_patience 50 --learning_rate 0.002 --opt_type adam --wd 5e-4 --epochs 300 --bs 20 \
    --resume /path/to/previous/output.pth

Train final model

Validation set:

python -m torch.distributed.launch --nproc_per_node=16 \
    --use_env --module \
    tools.standalone.imagenet_train \
    --output "$OUTPUT_DIR" "$ARCH" "$IMAGENET_DIR" \
    -b 256 --lr 2.64 --warmup-lr 0.1 \
    --warmup-epochs 5 --epochs 90 --sched cosine --num-classes 1000 \
    --opt rmsproptf --opt-eps 1. --weight-decay 4e-5 -j 8 --dist-bn reduce \
    --bn-momentum 0.01 --bn-eps 0.001 --drop 0. --no-held-out-val

Test set:

python -m torch.distributed.launch --nproc_per_node=16 \
    --use_env --module \
    tools.standalone.imagenet_train \
    --output "$OUTPUT_DIR" "$ARCH" "$IMAGENET_DIR" \
    -b 256 --lr 2.64 --warmup-lr 0.1 \
    --warmup-epochs 9 --epochs 360 --sched cosine --num-classes 1000 \
    --opt rmsproptf --opt-eps 1. --weight-decay 4e-5 -j 8 --dist-bn reduce \
    --bn-momentum 0.01 --bn-eps 0.001 --drop 0.15
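
In both commands above, assuming -b is the per-process batch size (as in timm-style trainers), the effective batch size is 16 × 256 = 4096, which explains the large base learning rate of 2.64.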