ViViT: Curvature access through the generalized Gauss-Newton's low-rank structure

Overview

[ πŸ‘· πŸ— πŸ‘· πŸ— Coming soon! Official release with improved docs. Stay tuned. πŸ‘· πŸ— πŸ‘· πŸ— ]

Python 3.7+

ViViT is a collection of numerical tricks to efficiently access curvature from the generalized Gauss-Newton (GGN) matrix based on its low-rank structure. Provided functionality includes computing

  • GGN eigenvalues
  • GGN eigenpairs (eigenvalues + eigenvectors)
  • 1Λ’α΅—- and 2ⁿᡈ-order directional derivatives along GGN eigenvectors
  • Newton steps

To reduce cost, these operations can also be applied to approximations of the GGN obtained via sub-sampling, Monte Carlo approximation, and block-diagonal approximation.

How does it work? ViViT uses and extends BackPACK for PyTorch. The described functionality is realized through a combination of existing and new BackPACK extensions and hooks into its backpropagation.
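
Concretely, for a mean-reduced loss over a mini-batch of N samples with C model outputs, the GGN factors as G = V Vα΅€ with a tall-and-skinny matrix V that has only NΒ·C columns, and the small Gram matrix Vα΅€V shares its nonzero eigenvalues with G. The sketch below illustrates this identity in plain PyTorch for a tiny MLP with mean-squared-error loss; it only demonstrates the underlying linear algebra and uses neither ViViT's nor BackPACK's API (the model, sizes, and tolerance are made up for illustration).

import torch

torch.manual_seed(0)

N, D_in, C = 8, 5, 3                           # batch size, input and output dimension
X = torch.rand(N, D_in)
model = torch.nn.Sequential(
    torch.nn.Linear(D_in, 4), torch.nn.Tanh(), torch.nn.Linear(4, C)
)
params = list(model.parameters())
D = sum(p.numel() for p in params)             # number of parameters (here 39)

# For torch.nn.MSELoss(reduction="mean"), the Hessian of the loss w.r.t. the model
# output is 2 / (N * C) * I, so the GGN equals V @ V.T where column (n, c) of V is
# sqrt(2 / (N * C)) times the gradient of output c of sample n w.r.t. the parameters.
columns = []
for n in range(N):
    out = model(X[n])                          # output for sample n, shape (C,)
    for c in range(C):
        grads = torch.autograd.grad(out[c], params, retain_graph=True)
        columns.append((2.0 / (N * C)) ** 0.5 * torch.cat([g.flatten() for g in grads]))
V = torch.stack(columns, dim=1)                # low-rank factor, shape (D, N * C)

ggn = V @ V.T                                  # (D, D): prohibitive for large models
gram = V.T @ V                                 # (N * C, N * C): small, same nonzero spectrum

top = torch.linalg.eigvalsh(ggn)[-N * C:]      # largest N * C eigenvalues of the GGN
print(torch.allclose(top, torch.linalg.eigvalsh(gram), atol=1e-5))  # True

Working in the NΒ·C-dimensional Gram space rather than the D-dimensional parameter space is what makes the operations listed above tractable; sub-sampling, Monte Carlo approximation, and block-diagonal approximation shrink V further.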

Installation

πŸ‘· πŸ— πŸ‘· πŸ— The PyPI release is coming soon. πŸ‘· πŸ— πŸ‘· πŸ—

For now, you need to install from GitHub via

pip install vivit-for-pytorch@git+https://github.com/f-dangel/vivit.git#egg=vivit-for-pytorch

Examples

πŸ‘· πŸ— πŸ‘· πŸ— Coming soon! πŸ‘· πŸ— πŸ‘· πŸ—

How to cite

If you are using ViViT, consider citing the paper

@misc{dangel2022vivit,
      title={{ViViT}: Curvature access through the generalized Gauss-Newton's low-rank structure},
      author={Felix Dangel and Lukas Tatzel and Philipp Hennig},
      year={2022},
      eprint={2106.02624},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
Comments
  • [ADD] Warn about instabilities if eigenvalues are small

    Computing directional gradients and transforming the Newton step from Gram space into parameter space both require division by the square root of the direction's eigenvalue, which is unstable when the eigenvalue is close to zero (see the sketch after this list).

    opened by f-dangel 1
  • [ADD] Clean `DirectionalDampedNewtonComputation`

    Adds directionally damped Newton step computation with a cleaned-up API.

    • Fixes a bug in the tests' eigenvalue criterion, which always picked one more eigenvalue than specified.
    opened by f-dangel 1
  • [DOC] Add NTK example

    Adds an example inspired by the functorch tutorial on NTKs. It demonstrates how to use vivit to compute empirical NTK matrices and compares the results with the functorch implementation.

    opened by f-dangel 1
  • [ADD] Simplify `DirectionalDerivatives` API

    Exotic features, such as using different GGNs to compute directions and directional curvatures, or full control over which intermediate buffers to keep, have been deprecated in favor of a simpler API.

    • Remove the Newton step computation for now, as it internally relied on `DirectionalDerivatives`
    • Remove many utilities and associated tests from the exotic features
    • Forbid duplicate indices in subsampling
    • Always delete intermediate buffers other than the target quantities
    opened by f-dangel 1
  • [DOC] Set up `sphinx` and RTD

    This PR adds a scaffold for the docs at https://vivit.readthedocs.io/en/latest/. Code examples are integrated via sphinx-gallery (I added a preliminary logo), and the docs are built for pull requests by the CI.

    To build the docs locally, run `make docs`. You need to install the documentation dependencies first, for example via `pip install -e .[docs]`.

    opened by f-dangel 1
  • Calculate Parameter Space Values of GGN Eigenvectors

    The docs show how to calculate the Gram matrix eigenvectors, and the paper explains that translating them from Gram space into parameter space only requires multiplying by the `V` matrix.

    What's the easiest way of implementing this? (See the sketch after this list.)

    question 
    opened by lk-wq 1
  • Detect loss function's `reduction`, error if unsupported

    For now, the library only supports `reduction='mean'`. We rely on the user to choose this reduction and point this out in the documentation. It would be better for the library to detect the reduction automatically and raise an error if it is unsupported.

    This can be done via a hook into BackPACK.

    • [ ] Implement hook that determines the loss function reduction during backpropagation
    • [ ] Integrate the above hook into the `*Computation` classes and raise an exception if the reduction is not supported
    • [ ] Remove the comments about supported reductions in the documentation
    enhancement 
    opened by f-dangel 0
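
Regarding the items above on computing parameter-space eigenvectors from Gram-space ones and on instabilities for small eigenvalues: if ẽ is a unit-norm eigenvector of the Gram matrix Vα΅€V with eigenvalue Ξ», then V ẽ / √λ is a unit-norm eigenvector of the GGN V Vα΅€ with the same eigenvalue. This is exactly where the division by √λ, and hence the instability for Ξ» close to zero, comes from. Below is a minimal sketch with a random stand-in for V; it does not use ViViT's API, and the sizes and cutoff are ad hoc.

import torch

torch.manual_seed(0)

D, K = 40, 12                                  # parameter dimension and Gram dimension (K = N * C)
V = torch.rand(D, K, dtype=torch.float64)      # stand-in for the low-rank factor, GGN = V @ V.T

gram = V.T @ V                                 # (K, K) Gram matrix
evals, evecs = torch.linalg.eigh(gram)         # Gram-space eigenpairs, eigenvalues ascending

# Discard directions with (near-)zero eigenvalues: the transformation below divides by
# sqrt(eigenvalue) and becomes numerically unstable as the eigenvalue approaches zero.
keep = evals > 1e-10 * evals.max()
evals, evecs = evals[keep], evecs[:, keep]

# Gram space -> parameter space: e = V @ e_tilde / sqrt(lambda) is a unit-norm
# eigenvector of the GGN with the same eigenvalue lambda.
eigvecs_param = (V @ evecs) / evals.sqrt()

ggn = V @ V.T
print(torch.allclose(ggn @ eigvecs_param, eigvecs_param * evals))         # True
print(torch.allclose(eigvecs_param.norm(dim=0), torch.ones_like(evals)))  # True (unit norm)
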
Releases
1.0.0

Owner
Felix Dangel
Machine Learning PhD student at the University of TΓΌbingen and the Max Planck Institute for Intelligent Systems.