This repository is dedicated to developing and maintaining code for experiments with wide neural networks.

Overview

Wide-Networks

This repository contains the code for various experiments on wide neural networks. In particular, we implement classes for abc-parameterizations of NNs as defined by (Yang & Hu 2021). Although an equivalent description can be given using only ac-parameterizations, we keep the 3 scales (a, b and c) in the code to allow more flexibility depending on how we want to approach the problem of dealing with infinitely wide NNs.

Structure of the code

The BaseModel class

All the code related to neural networks is in the directory pytorch. The different models we have implemented are in this directory, along with the base class found in the file base_model.py, which implements the generic attributes and methods all our NN classes share.

The BaseModel class inherits from the PyTorch Lightning module and essentially defines the attributes any NN needs to work properly, namely the architecture (defined in the _build_model() method), the activation function (the same activation function is used at every layer), the loss function, the optimizer and the initializer for the parameters of the network.

Optionally, the BaseModel class can define attributes for the normalization (e.g. BatchNorm, LayerNorm, etc) and the scheduler, and any of the aforementioned attributes (optional or not) can be customized depending on the needs (see examples for the scheduler of ipllr and the initializer of abc_param).
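
To make this concrete, below is a minimal, hypothetical sketch of what a child class of BaseModel typically looks like; the class name, attribute names and config fields are illustrative assumptions, not the repository's exact code.

import torch
import torch.nn as nn
import pytorch_lightning as pl

class TinyFCModel(pl.LightningModule):  # BaseModel itself inherits from the Lightning module
    def __init__(self, config):
        super().__init__()
        self.config = config
        self.activation = nn.ReLU()        # same activation used at every layer
        self.loss = nn.CrossEntropyLoss()  # in BaseModel this is chosen from the config
        self._build_model()

    def _build_model(self):
        # architecture-specific part, overridden by each child class
        width = self.config.width  # hypothetical config field
        self.layers = nn.Sequential(
            nn.Linear(28 * 28, width), self.activation,
            nn.Linear(width, 10),
        )

    def forward(self, x):
        return self.layers(x.flatten(1))

    def configure_optimizers(self):
        # the optimizer and its hyper-parameters also come from the config
        return torch.optim.SGD(self.parameters(), lr=self.config.lr)  # hypothetical config field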

The ModelConfig class

All the hyper-parameters which define the model (depth, width, activation function name, loss name, optimizer name, etc.) have to be passed as an argument to __init__() as an object of the class ModelConfig (pytorch/configs/model.py). This class reads from a yaml config file which defines all the necessary objects for an NN (see examples in pytorch/configs). Essentially, ModelConfig is there so that one only has to set up the yaml config file properly, and the corresponding attributes are then correctly populated in BaseModel.
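
As a rough illustration of the config-driven setup (the file path and field names below are assumptions, not necessarily those used in pytorch/configs), such a yaml file can be read into a nested dictionary which ModelConfig then presumably exposes as attributes:

import yaml

# illustrative path and field names; see pytorch/configs for the actual examples
with open("pytorch/configs/fc_config.yaml", "r") as f:
    raw_config = yaml.safe_load(f)

# raw_config is then a nested dictionary, e.g. something like
# {"architecture": {"width": 1024, "n_layers": 6},
#  "activation": {"name": "relu"},
#  "loss": {"name": "cross_entropy"},
#  "optimizer": {"name": "sgd", "params": {"lr": 0.01}}}
print(raw_config)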

abc-parameterizations

The code for abc-parameterizations (Yang & Hu 2021) can be found in pytorch/abc_params. There we define the base class for abc-parameterizations, which mainly sets the layer, init and lr scales from the values of a, b, c, and defines the initial parameters through Gaussians of appropriate variance depending on the value of b and the activation function.
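
As a reminder of the convention (a sketch following Yang & Hu 2021, not the repository's exact implementation): for a layer of width n, the effective weight carries a forward scale n^(-a), the raw weights are initialized as Gaussians of standard deviation n^(-b), and the SGD learning rate is scaled by n^(-c).

import torch

def abc_scales(width, a, b, c, base_lr=1.0):
    # layer (forward) scale, init std and lr scale of an abc-parameterization
    layer_scale = width ** (-a)         # the effective weight is layer_scale * w
    init_std = width ** (-b)            # w is initialized as N(0, width ** (-2 * b))
    lr_scale = base_lr * width ** (-c)  # learning rate actually used for SGD
    return layer_scale, init_std, lr_scale

# purely illustrative values of (a, b, c) for a hidden layer of width 1024
layer_scale, init_std, lr_scale = abc_scales(width=1024, a=0.5, b=0.5, c=0.0)
w = torch.randn(1024, 1024) * init_std  # Gaussian init with the variance above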

Everything that is architecture-specific (fully-connected, conv, residual, etc.) is left out of this base class and has to be implemented in the _build_model() method of the child class (see examples in pytorch/abc_params/fully_connected). We also define there the base classes for the ntk, muP (Yang & Hu 2021), ip and ipllr parameterizations, and their fully-connected implementations in pytorch/abc_params/fully_connected.
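
For instance, a fully-connected layer in this setting could look like the following hypothetical sketch (again, not the repository's exact code), where the layer scale multiplies the forward pass and the init scale sets the Gaussian used for the raw weights:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledLinear(nn.Module):
    # hypothetical fully-connected layer under an abc-parameterization
    def __init__(self, fan_in, fan_out, a, b):
        super().__init__()
        self.layer_scale = fan_in ** (-a)  # forward (layer) scale
        # raw weights w initialized as N(0, fan_in ** (-2 * b))
        self.weight = nn.Parameter(torch.randn(fan_out, fan_in) * fan_in ** (-b))

    def forward(self, x):
        # effective weight is layer_scale * w
        return F.linear(x, self.layer_scale * self.weight)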

Experiment runs

Setup

Before running any experiment, make sure you first install all the necessary packages:

pip3 install -r requirements.txt

You can optionally create a virtual environment through

python3 -m venv your_env_dir

then activate it with

source your_env_dir/bin/activate

and then install the requirements once the environment is activated. If you haven't installed the wide-networks library in site-packages, make sure you add it to the PYTHONPATH before running the command for your experiment, by running the command

export PYTHONPATH=$PYTHONPATH:"$PWD"

from the root directory (wide-networks/.) where the wide-networks library is located.

Python jobs

We define python jobs which can be run with arguments from the command line in the directory jobs. Mainly, those jobs launch a training / val / test pipeline for a given model using the Lightning module, and the results are collected in a dictionary which is saved to a pickle file at the end of training for later examination. Additionally, metrics are logged in TensorBoard and can be visualized during training with the command

tensorboard --logdir=your_experiment_dir

We have written jobs to launch experiments on MNIST and CIFAR-10 with the fully-connected version of different models such as muP (Yang & Hu 2021), IP-LLR and Naive-IP, which can be found in jobs/abc_parameterizations. Arguments can be passed to those Python scripts through the command line, but they are optional and the default values are used if the parameters of the script are not manually set. For example, the command

python3 jobs/abc_parameterizations/fc_muP_run.py --activation="relu" --n_steps=600 --dataset="mnist"

will launch a training / val / test pipeline with ReLU as the activation function, 600 SGD steps and the MNIST dataset. The other parameters of the run (e.g. the base learning rate and batch size) will have their default values. The jobs automatically create a directory (and potentially subdirectories) for the experiment and save there the python logs, the TensorBoard events, the results dictionary as a pickle file, and the checkpoints saved for the network.

Visualizing results

To visualize the results after training for a given experiment, one can launch the notebook experiments-results.ipynb located in pytorch/notebooks/training/abc_parameterizations and simply change the arguments in the "Set variables" cell to load the results from the corresponding experiment. Running all the cells will then produce (and save) figures related to the training phase (e.g. loss vs. steps).
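
If you prefer to inspect the results outside the notebook, a quick way is to load the pickle file directly; note that the path and the dictionary keys below are assumptions about the saved format, so adapt them to what your run actually produced:

import pickle
import matplotlib.pyplot as plt

# illustrative path: point this at the results pickle saved in your experiment directory
with open("your_experiment_dir/results.pickle", "rb") as f:
    results = pickle.load(f)

# assuming the dictionary stores the per-step training losses under a key like "training_loss"
losses = results.get("training_loss", [])
plt.plot(losses)
plt.xlabel("SGD steps")
plt.ylabel("training loss")
plt.show()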

Owner
Karl Hajjar
PhD student at Laboratoire de Mathématiques d'Orsay