Official code repository for the publication "Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons"

Overview

Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons

This repository contains the code to reproduce the results of the NeurIPS 2021 paper "Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons" (also available on arXiv).

Requirements

To install requirements:

pip install -r requirements.txt
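Optionally, create a fresh virtual environment first (standard Python practice, not a repository requirement):

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt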

Training & Evaluation

Code for the fully connected (FC) MNIST experiments (Fig. 2b and Fig. 4a,c)

The code can be found in fig2b_fig4ac_mnist/src/.

Running the experiments: for example, to run all experiments needed to reproduce Fig. 2b, execute:

cd fig2b_fig4ac_mnist/src/
/bin/bash 2b_jobs.sh

The results of each run (e.g., metrics, outputs, and configuration) are saved in fig2b_fig4ac_mnist/runs/{run_number}/.

For the experiments in Fig. 4, replace 2b_jobs.sh with 4a_jobs.sh or 4c_jobs.sh, respectively.

The seeds chosen for these experiments were 42, 69, 12345, 98765, 38274, 28374, 42848, 48393, 83475, and 57381.
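The jobs scripts launch all of these runs; to repeat just one, here is a minimal sketch of the pattern (train.py and the --seed flag are illustrative placeholders, not the repository's actual interface; see 2b_jobs.sh for the real invocations):

cd fig2b_fig4ac_mnist/src/
# Illustrative only: entry point and flag name are assumptions.
for seed in 42 69 12345 98765 38274 28374 42848 48393 83475 57381; do
    python3 train.py --seed "$seed"
done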

Code for HIGGS, MNIST and CIFAR10 with and without LE (Fig. 2c-e)

The code can be found in fig2cde_higgs_mnist_cifar10.

The configuration is integrated into the main files; only a few parameters are exposed via argparse.

To run the code, see the respective submit_python_*_v100.sh file, which contains example invocations and the full run configurations for all seeds used.

The seeds chosen for these experiments were 1, 2, 3, 5, 7, 8, 13, 21, and 34 (Fibonacci numbers plus lucky number 7), resulting in nine seeds per experiment.

Results are written to the log file produced by redirecting the standard output of the running code, e.g. python -u *_training.py > file.log.
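For example, a sketch of running one dataset across all seeds with log redirection (higgs_training.py and the --seed flag are assumptions here; the authoritative script names and flags are in the submit_python_*_v100.sh files):

# Illustrative only: check submit_python_*_v100.sh for the real commands.
for seed in 1 2 3 5 7 8 13 21 34; do
    python -u higgs_training.py --seed "$seed" > "higgs_seed_${seed}.log"
done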

Code for Dendritic Microcircuits with and without LE (Fig. 3 and Fig. 5)

The code can be found in fig3fig5_dendritic_microcircuits/src/.

The experiments are configured using config files. All config files required to reproduce the plotted results are in fig3fig5_dendritic_microcircuits/experiment_configs/. The config files are named {task_name}_{le_or_orig}_tpres_{t_pres in units of dt}.yaml, where task_name is bars (Fig. 3) or mimic (Fig. 5) and le_or_orig is le (with Latent Equilibrium) or orig (without).

For each run, the results are saved in fig3fig5_dendritic_microcircuits/experiment_results/{config file name}_{timestamp}/.

To run an experiment:

cd fig3fig5_dendritic_microcircuits/src/
python3 run_bars.py train ../experiment_configs/{chosen_config_file}

For the experiments in Fig. 5, replace run_bars.py with run_single_mc.py.
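That is, for Fig. 5:

cd fig3fig5_dendritic_microcircuits/src/
python3 run_single_mc.py train ../experiment_configs/{chosen_config_file}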

To plot the results of a run:

cd fig3fig5_dendritic_microcircuits/src/
python3 run_bars.py eval ../experiment_results/{results_dir_of_run_to_be_evaluated}

This generates plots of the results (depending on how many variables you configured to be recorded, more or fewer plots are generated) and saves them in the respective results directory. Which plots are produced is defined in run_X.py.
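To evaluate several finished runs in one pass, here is a minimal sketch assembled from the eval command above (it assumes every directory under experiment_results/ was produced by run_bars.py; mimic runs would use run_single_mc.py instead):

cd fig3fig5_dendritic_microcircuits/src/
# Sketch: evaluate every result directory with the documented eval mode.
for d in ../experiment_results/*/; do
    python3 run_bars.py eval "$d"
done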

Reproducing all data needed for Fig. 3:

For the results shown in Fig. 3, all config files named bars_*.yaml need to be run for 10 different seeds (set in the config file). The seeds chosen for these experiments were 12345, 12346, 12347, 12348, 12349, 12350, 12351, 12352, 12353, and 12354.
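A compact way to launch all bars configs in sequence, built from the documented train command (note that the seed is set inside each YAML file, so this loop is repeated once per seed after editing the configs):

cd fig3fig5_dendritic_microcircuits/src/
# Sketch: train on every bars config; seeds live inside the YAML files.
for cfg in ../experiment_configs/bars_*.yaml; do
    python3 run_bars.py train "$cfg"
done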

Contributing

📋 TODO: Pick a licence and describe how to contribute to your code repository.

Owner
Computational Neuroscience, University of Bern