Open-L2O: A Comprehensive and Reproducible Benchmark for Learning to Optimize Algorithms

This repository provides the first comprehensive benchmark of existing learning to optimize (L2O) approaches across a range of problems and settings. We release our software implementation and data as the Open-L2O package to support reproducible research and fair benchmarking in the L2O field. [Paper]

License: MIT

Overview

What is learning to optimize (L2O)?

L2O (learning to optimize) aims to replace manually designed analytic optimization algorithms (e.g., SGD, RMSProp, Adam) with learned update rules.

How does L2O work?

An L2O optimizer is a parameterized function that is fit from data: it gains experience by training on sample optimization tasks in a principled and automatic way.
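
As a rough illustration, a learned optimizer is meta-trained by unrolling it on sample tasks, accumulating the optimizee losses along the trajectory, and backpropagating into the optimizer's own parameters. The sketch below is a minimal PyTorch example under assumed names (`learned_update`, `meta_opt`) and a random quadratic optimizee; it illustrates the paradigm, not the exact training loop used in this repository.

```python
import torch

def meta_train_step(learned_update, meta_opt, unroll_steps=20):
    """One meta-training step: unroll the learned update rule on a random
    quadratic optimizee and backprop the accumulated loss into the rule's
    own parameters (illustrative sketch; `learned_update` is hypothetical)."""
    A, b = torch.randn(10, 10), torch.randn(10)
    x = torch.zeros(10, requires_grad=True)      # optimizee variable
    state, meta_loss = None, 0.0
    for _ in range(unroll_steps):
        loss = ((A @ x - b) ** 2).sum()                          # optimizee loss f(x)
        grad, = torch.autograd.grad(loss, x, create_graph=True)
        update, state = learned_update(grad, state)              # learned rule replaces e.g. -lr * grad
        x = x + update
        meta_loss = meta_loss + loss             # meta-objective: sum of losses along the trajectory
    meta_opt.zero_grad()
    meta_loss.backward()                         # gradients w.r.t. the optimizer's own parameters
    meta_opt.step()
    return meta_loss.item()
```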

What can L2O do for you?

L2O is particularly suitable when the same type of optimization problem must be solved repeatedly over a specific distribution of data. Compared to classic methods, L2O has been shown to find higher-quality solutions and/or to converge much faster on many problems.

Open questions for research?

  • There are significant theoretical and practical gaps between manually designed optimizers and existing L2O models.

Main Results

Learning to optimize sparse recovery

Learning to optimize Lasso functions

Learning to optimize non-convex Rastrigin functions

Learning to optimize neural networks

Supported Model-based Learnable Optimizers

All code is available here. A minimal sketch of the unrolling idea these methods share follows the list below.

  1. LISTA (feed-forward form) from Learning fast approximations of sparse coding [Paper]
  2. LISTA-CP from Theoretical Linear Convergence of Unfolded ISTA and its Practical Weights and Thresholds [Paper]
  3. LISTA-CPSS from Theoretical Linear Convergence of Unfolded ISTA and its Practical Weights and Thresholds [Paper]
  4. LFISTA from Understanding Trainable Sparse Coding via Matrix Factorization [Paper]
  5. LAMP from AMP-Inspired Deep Networks for Sparse Linear Inverse Problems [Paper]
  6. ALISTA from ALISTA: Analytic Weights Are As Good As Learned Weights in LISTA [Paper]
  7. GLISTA from Sparse Coding with Gated Learned ISTA [Paper]
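
These model-based methods unroll a classic iterative algorithm (here, ISTA for sparse coding) into a fixed-depth network and learn the per-layer weights and thresholds end to end. Below is a minimal sketch of a single unrolled layer with assumed names (`W1`, `W2`, `theta`); it illustrates the unrolling idea rather than the exact implementation in this repository.

```python
import torch
import torch.nn as nn

class LISTALayer(nn.Module):
    """One unrolled ISTA step: x_{k+1} = soft_threshold(W1 y + W2 x_k, theta),
    with W1, W2, theta learned from data (sketch; not the repo's exact API)."""
    def __init__(self, m, n):
        super().__init__()
        self.W1 = nn.Linear(m, n, bias=False)         # plays the role of A^T / L in ISTA
        self.W2 = nn.Linear(n, n, bias=False)         # plays the role of I - A^T A / L in ISTA
        self.theta = nn.Parameter(torch.tensor(0.1))  # learnable soft-threshold

    def forward(self, y, x):
        z = self.W1(y) + self.W2(x)
        return torch.sign(z) * torch.relu(torch.abs(z) - self.theta)  # soft-shrinkage
```

Stacking several such layers (with tied or untied weights) yields a fixed-iteration sparse-recovery network that is trained end to end on measurement/code pairs.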

Supported Model-free Learnable Optimizers

  1. L2O-DM from Learning to learn by gradient descent by gradient descent [Paper] [Code]
  2. L2O-RNNProp from Learning Gradient Descent: Better Generalization and Longer Horizons [Paper] [Code]
  3. L2O-Scale from Learned Optimizers that Scale and Generalize [Paper] [Code]
  4. L2O-enhanced from Training Stronger Baselines for Learning to Optimize [Paper] [Code]
  5. L2O-Swarm from Learning to Optimize in Swarms [Paper] [Code]
  6. L2O-Jacobian from HALO: Hardware-Aware Learning to Optimize [Paper] [Code]
  7. L2O-Minmax from Learning A Minimax Optimizer: A Pilot Study [Paper] [Code]
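
In contrast, model-free methods parameterize the update rule itself, typically with a small coordinate-wise recurrent network that maps the current gradient to a parameter update, in the spirit of L2O-DM. The class below is a hedged illustration of that structure (the name and sizes are assumptions, not this repository's API); an instance of it could serve as `learned_update` in the meta-training sketch shown earlier.

```python
import torch
import torch.nn as nn

class CoordinateWiseLSTM(nn.Module):
    """Maps each coordinate's gradient to an update through a shared LSTM
    (illustrative sketch in the spirit of L2O-DM; not the repo's exact model)."""
    def __init__(self, hidden_size=20):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, grad, state=None):
        # treat every coordinate as an independent batch element: (seq=1, batch=n_coords, feat=1)
        g = grad.detach().reshape(1, -1, 1)   # detaching drops second-order terms, as in L2O-DM
        out, state = self.lstm(g, state)
        update = self.head(out).reshape(grad.shape)
        return 0.1 * update, state            # small output scale helps stabilize early training
```

For example, `opt_net = CoordinateWiseLSTM(); meta_opt = torch.optim.Adam(opt_net.parameters(), lr=1e-3)` pairs this update rule with the meta-training loop above.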

Supported Optimizees

Convex Functions:

  • Quadratic
  • Lasso

Non-convex Functions:

  • Rastrigin

Minmax Functions:

  • Saddle
  • Rotated Saddle
  • Seesaw
  • Matrix Game

Neural Networks:

  • MLPs on MNIST
  • ConvNets on MNIST and CIFAR-10
  • LeNet
  • NAS-searched architectures
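
Each optimizee is simply the objective that a learned optimizer is trained or evaluated on. As a hedged example (the function signature is an assumption, not this repository's interface), the non-convex Rastrigin function from the list above can be written as:

```python
import math
import torch

def rastrigin(x, A=10.0):
    """Rastrigin function: A*n + sum_i (x_i^2 - A*cos(2*pi*x_i)); global minimum 0 at x = 0."""
    n = x.shape[-1]
    return A * n + torch.sum(x ** 2 - A * torch.cos(2 * math.pi * x), dim=-1)

print(rastrigin(torch.zeros(2)))   # tensor(0.) at the global minimizer
```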

Other Resources

  • This is a PyTorch implementation of L2O-DM. [Code]
  • This is the original L2O-Swarm repository. [Code]
  • This is the original L2O-Jacobian repository. [Code]

Future Works

  • A toolbox v2 implemented in TF 2.0, with a unified framework and library dependencies.

Cite

@misc{chen2021learning,
      title={Learning to Optimize: A Primer and A Benchmark}, 
      author={Tianlong Chen and Xiaohan Chen and Wuyang Chen and Howard Heaton and Jialin Liu and Zhangyang Wang and Wotao Yin},
      year={2021},
      eprint={2103.12828},
      archivePrefix={arXiv},
      primaryClass={math.OC}
}