UniNAS

A highly modular PyTorch framework with a focus on Neural Architecture Search (NAS).

Under development

(development happens mostly on our internal GitLab; we push to GitHub only every once in a while)

  • APIs may change
  • argparse arguments may be moved to more fitting classes
  • there may be incomplete or not-yet-working pieces of code
  • ...

Features

  • modular and therefore reusable
    • data set loading,
    • network building code and topologies,
    • methods to train architecture weights,
    • sets of operations (primitives),
    • weight initializers,
    • metrics,
    • ... and more
  • everything is configurable from the command line and/or config files
    • improved reproducibility, since detailed run configurations are saved and logged
    • powerful search network descriptions enable e.g. highly customizable weight sharing settings
    • the underlying argparse mechanism enables using a GUI for configurations
  • compare results of different methods in the same environment
  • import and export detailed network descriptions
  • integrate new methods and more with fairly little effort
  • NAS-Benchmark integration
    • NAS-Bench-201
  • ... and more

Where is this code from?

Except for a few pieces, the code is entirely self-written. However, the official code of other methods is sometimes useful to learn from or to clear up details, and other frameworks can be used for their nice features.

Other meta-NAS frameworks

  • Deep Architect
    • highly customizable search spaces, hyperparameters, ...
    • the searchers (SMBO, MCTS, ...) focus on fully training (many) models and are not differentiable
  • D-X-Y NAS-Projects
  • Auto-PyTorch
    • a stronger focus on model selection than on optimizing a single architecture
  • Vega
  • NNI

Repository notes

Dynamic argparse tree

Everything is an argument. Learning rate? Argument. Scheduler? Argument. The exact topology of a network, including how many cells of each type it has and whether they share their architecture weights? Also arguments.

This is enabled by the idea that every class in use (method, network, cells, regularizers, ...) can add its own arguments to argparse, including which further classes it requires (e.g. a method needs a network, which needs a stem).

It starts with the Main class adding a Task (cls_task), which itself adds all required components (cls_*).
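
As an illustration, the mechanism could look roughly like the following minimal sketch. SimpleTrainer and its two arguments reappear in the examples further below, but the add_arguments classmethod shown here is hypothetical, not the actual UniNAS API:

  import argparse

  class SimpleTrainer:
      @classmethod
      def add_arguments(cls, parser: argparse.ArgumentParser):
          # each class contributes its own, name-spaced arguments
          parser.add_argument('--SimpleTrainer.max_epochs', type=int, default=100)
          parser.add_argument('--SimpleTrainer.test_last', type=int, default=10)

  parser = argparse.ArgumentParser()
  SimpleTrainer.add_arguments(parser)  # classes extend the parser themselves
  args = parser.parse_args(['--SimpleTrainer.max_epochs', '50'])
  print(getattr(args, 'SimpleTrainer.max_epochs'))  # 50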

To see all available (meta) arguments, run Main.list_all_arguments() in uninas/main.py.
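
For example, from a Python shell (assuming the repository root is on the Python path):

  from uninas.main import Main

  Main.list_all_arguments()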

Graphical user interface

Since putting the arguments together correctly is not trivial (and requires some familiarity with the code base), an easier approach is to use a GUI.

Have a look at uninas/gui/tk_gui/main.py, a tkinter GUI frontend.

The GUI can automatically filter usable classes, display available arguments, and show tooltips, based solely on the argparse (meta) arguments implemented in the respective classes.
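
It is a regular Python script and should start like one (assuming the project dependencies, including tkinter, are installed):

  python uninas/gui/tk_gui/main.py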

Some meta arguments take a single class name:

e.g. cls_task, cls_trainer, cls_data, cls_criterion, cls_method

The chosen classes define their own arguments, e.g.:

  • cls_trainer="SimpleTrainer"
  • SimpleTrainer.max_epochs=100
  • SimpleTrainer.test_last=10

Their names are also available as wildcards, which automatically resolve to the respectively chosen class name:

  • cls_trainer="SimpleTrainer"
  • {cls_trainer}.max_epochs --> SimpleTrainer.max_epochs
  • {cls_trainer}.test_last --> SimpleTrainer.test_last
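
The benefit is that a class can be swapped without touching the argument lines that use wildcards. E.g. with a hypothetical OtherTrainer that defines the same arguments:

  • cls_trainer="OtherTrainer"
  • {cls_trainer}.max_epochs --> OtherTrainer.max_epochs
  • {cls_trainer}.test_last --> OtherTrainer.test_last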

Some meta arguments take a comma-separated list of class names:

e.g. cls_metrics, cls_initializers, cls_regularizers, cls_optimizers, cls_schedulers

The chosen classes also define their own arguments, but their argument names always include an index, e.g.:

  • cls_regularizers="DropOutRegularizer, DropPathRegularizer"
  • DropOutRegularizer#0.prob=0.5
  • DropPathRegularizer#1.max_prob=0.3
  • DropPathRegularizer#1.drop_id_paths=false

They are likewise available as indexed wildcards:

  • cls_regularizers="DropOutRegularizer, DropPathRegularizer"
  • {cls_regularizers#0}.prob --> DropOutRegularizer#0.prob
  • {cls_regularizers#1}.max_prob --> DropPathRegularizer#1.max_prob
  • {cls_regularizers#1}.drop_id_paths --> DropPathRegularizer#1.drop_id_paths
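
Putting it together, such a configuration might be passed on the command line roughly like this (a hypothetical invocation; the exact entry point, argument format, and quoting may differ):

  python uninas/main.py \
      cls_regularizers="DropOutRegularizer, DropPathRegularizer" \
      "{cls_regularizers#0}.prob=0.5" \
      "{cls_regularizers#1}.max_prob=0.3" \
      "{cls_regularizers#1}.drop_id_paths=false"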

Register

UniNAS makes heavy use of a registering mechanism (via decorators in uninas/register.py). Classes of the same type (e.g. optimizers, networks, ...) register themselves in one RegisterDict.

Registered classes can be accessed via their name in the Register, regardless of their actual location in the code. This enables e.g. saving network topologies as nested dictionaries, no matter how complicated they are, since the class names alone are enough to find the classes in the code. (It also grants a certain amount of refactoring freedom.)
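
The idea behind such a registry, as a minimal self-contained sketch (only the name RegisterDict is taken from UniNAS; the rest is illustrative, not the actual uninas/register.py API):

  # illustrative sketch of a decorator-based registry
  class RegisterDict(dict):
      def __call__(self, cls):
          # used as a class decorator: store the class under its own name
          self[cls.__name__] = cls
          return cls

  register_network = RegisterDict()

  @register_network
  class MyNetwork:
      pass

  # a saved topology only needs to store the string 'MyNetwork';
  # the class is recovered by name, regardless of where it is defined
  net = register_network['MyNetwork']()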

Exporting networks

(Trained) networks can easily be used by other PyTorch frameworks/scripts; see verify.py for a simple example.
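
A minimal sketch of the consumer side (the file name and input size are hypothetical, and verify.py remains the authoritative example), assuming the exported network behaves like a regular torch.nn.Module:

  import torch

  # depending on how the network was exported, the loading call may differ
  net = torch.load('exported_network.pt')  # hypothetical file name
  net.eval()
  with torch.no_grad():
      out = net(torch.randn(1, 3, 32, 32))  # e.g. a CIFAR-10 sized input
  print(out.shape)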

Citation

The framework

We will possibly create a whitepaper at some point; until then, please cite the repository:

@misc{kl2020uninas,
  author = {Kevin Alexander Laube},
  title = {UniNAS},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/cogsys-tuebingen/uninas}}
}

Inter-choice dependent super-network weights

  1. Train super-networks, e.g. via experiments/demo/inter_choice_weights/icw1_train_supernet_nats.py (see the commands after this list)
    • you will need CIFAR-10, but can also easily use fake data or download it
    • to generate SubImageNet, see uninas/utils/generate/data/subImageNet
  2. Evaluate the super-network, e.g. via experiments/demo/inter_choice_weights/icw2_eval_supernet.py
  3. View the evaluation results in the save dir, in TensorBoard, or plotted directly
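
For example (assuming the scripts are run from the repository root):

  python experiments/demo/inter_choice_weights/icw1_train_supernet_nats.py
  python experiments/demo/inter_choice_weights/icw2_eval_supernet.py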

@article{laube2021interchoice,
  title = {Inter-choice dependent super-network weights},
  author = {Laube, Kevin Alexander and Zell, Andreas},
  journal = {arXiv preprint arXiv:2104.11522},
  year = {2021}
}