Official code repository for the publication "Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons"

Overview

Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons

This repository contains the code to reproduce the results of the NeurIPS 2021 paper "Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons" (also available on arXiv).

Requirements

To install requirements:

pip install -r requirements.txt
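
Optionally, the requirements can be installed into a fresh virtual environment to keep them isolated from the system Python (a minimal sketch using the standard venv module; the environment name is arbitrary):

python3 -m venv le_env            # create an isolated environment
source le_env/bin/activate        # activate it
pip install -r requirements.txt   # install the pinned dependencies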

Training & Evaluation

Code for FC MNIST experiments (Fig. 2b and Fig. 4a,c)

The code can be found in fig2b_fig4ac_mnist/src/.

Running the experiments: for example, to run all experiments needed to reproduce Fig. 2b, execute:

cd fig2b_fig4ac_mnist/src/
/bin/bash 2b_jobs.sh

The results of each run (e.g. metrics, outputs, and configuration) are saved in fig2b_fig4ac_mnist/runs/{run_number}/.

For the experiments in Fig. 4, replace 2b_jobs.sh with 4a_jobs.sh or 4c_jobs.sh, respectively.
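
If all three panels are to be reproduced in one go, the job scripts can also be run back to back (a minimal sketch, executed from the src directory as above):

cd fig2b_fig4ac_mnist/src/
for job in 2b_jobs.sh 4a_jobs.sh 4c_jobs.sh; do
    /bin/bash "$job"   # each script launches all runs needed for its panel
done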

The seeds chosen for these experiments were 42, 69, 12345, 98765, 38274, 28374, 42848, 48393, 83475, and 57381.

Code for HIGGS, MNIST and CIFAR10 with and without LE (Fig. 2c-e)

The code can be found in fig2cde_higgs_mnist_cifar10.

The configuration is integrated directly into the main files; only a few parameters are exposed via argparse.

To run the code, check the respective submit_python_*_v100.sh file, which contains examples and the run configurations for all seeds used.

The seeds chosen for these experiments were 1, 2, 3, 5, 7, 8, 13, 21, and 34 (Fibonacci numbers plus lucky number 7), resulting in 9 seeds per experiment.

Results can be found in the respective log file, produced by redirecting the standard output of the running code, e.g. python -u *_training.py > file.log.
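
As an illustration, such runs could also be launched manually for several seeds with one log file per seed (a sketch only; the script name and the --seed flag below are assumptions, the authoritative invocations are in the submit_python_*_v100.sh files):

cd fig2cde_higgs_mnist_cifar10/
for seed in 1 2 3 5 7 8 13 21 34; do
    # hypothetical argument name; check the argparse options of the actual script
    python -u mnist_training.py --seed "$seed" > "mnist_seed${seed}.log"
done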

Code for Dendritic Microcircuits with and without LE (Fig. 3 and Fig. 5)

The code can be found in fig3fig5_dendritic_microcircuits/src/.

The experiments are configured via config files. All config files required to produce the plotted results are in fig3fig5_dendritic_microcircuits/experiment_configs/. The config files follow the naming scheme {task_name}_{le or orig}_tpres_{tpres in units of dt}.yaml, where task_name is bars (Fig. 3) or mimic (Fig. 5) and le/orig indicates whether LE is used or not.
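
For illustration, file names following this scheme look like the examples below (the tpres values here are placeholders, not necessarily the ones shipped in the repository):

bars_le_tpres_10.yaml      # bars task (Fig. 3), with LE, tpres = 10 dt
bars_orig_tpres_10.yaml    # bars task (Fig. 3), without LE
mimic_le_tpres_100.yaml    # mimic task (Fig. 5), with LE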

For each run the results will be saved in fig3fig5_dendritic_microcircuits/experiment_results/{config file name}_{timestamp}/.

To run an experiment:

cd fig3fig5_dendritic_microcircuits/src/
python3 run_bars.py train ../experiment_configs/{chosen_config_file}

For the experiment in Fig. 5, replace run_bars.py with run_single_mc.py.

To plot the results of a run:

cd fig3fig5_dendritic_microcircuits/src/
python3 run_bars.py eval ../experiment_results/{results_dir_of_run_to_be_evaluated}

This will generate plots of the results (depending on how many variables you configured to be recorded, more or fewer plots are generated) and save them in the respective results directory. Which plots are generated is defined in run_X.py.
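
To evaluate several runs in one go, the eval command can be looped over the result directories (a minimal sketch, reusing the directory layout described above):

cd fig3fig5_dendritic_microcircuits/src/
for res in ../experiment_results/bars_*/; do
    python3 run_bars.py eval "$res"   # writes the plots into each results directory
done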

Reproduce all data needed for Fig. 3:

For the results shown in Fig. 3, all config files named bars_*.yaml need to be run for 10 different seeds (configurable in the config file). The seeds chosen for these experiments were 12345, 12346, 12347, 12348, 12349, 12350, 12351, 12352, 12353, and 12354. A sketch for launching all bars configurations is given below.
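
A minimal sketch for launching all bars configurations (assuming the seed has already been set inside each config file, as described above):

cd fig3fig5_dendritic_microcircuits/src/
for cfg in ../experiment_configs/bars_*.yaml; do
    python3 run_bars.py train "$cfg"   # one training run per config file
done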

Contributing

📋 TODO: Pick a licence and describe how to contribute to your code repository.

Owner
Computational Neuroscience, University of Bern