Einshape: DSL-based reshaping library for JAX and other frameworks.

Overview


The jnp.einsum op provides a DSL-based unified interface to matmul and tensordot ops. This einshape library is designed to offer a similar DSL-based approach to unifying reshape, squeeze, expand_dims, and transpose operations.

Some examples:

  • einshape("n->n111", x) is equivalent to expand_dims(x, axis=1) three times
  • einshape("a1b11->ab", x) is equivalent to squeeze(x, axis=[1,3,4])
  • einshape("nhwc->nchw", x) is equivalent to transpose(x, perm=[0,3,1,2])
  • einshape("mnhwc->(mn)hwc", x) is equivalent to a reshape combining the two leading dimensions
  • einshape("(mn)hwc->mnhwc", x, n=batch_size) is equivalent to a reshape splitting the leading dimension into two, using kwargs (m or n or both) to supply the necessary additional shape information
  • einshape("mn...->(mn)...", x) combines the two leading dimensions without knowing the rank of x
  • einshape("n...->n(...)", x) performs a 'batch flatten'
  • einshape("ij->ijk", x, k=3) inserts a trailing dimension and tiles along it
  • einshape("ij->i(nj)", x, n=3) tiles along the second dimension

See jax_ops.py for the JAX implementation of the einshape function. Alternatively, the parser and engine are exposed in engine.py allowing analogous implementations in TensorFlow or other frameworks.
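
As a quick sanity check, here is a minimal JAX sketch (assuming JAX is installed; the array shapes are chosen purely for illustration) that verifies the shapes produced by a few of the equations above:

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.zeros((6, 4, 4, 3))  # a toy (batch, height, width, channels) array

assert einshape("nhwc->nchw", x).shape == (6, 3, 4, 4)              # transpose
assert einshape("(mn)hwc->mnhwc", x, n=2).shape == (3, 2, 4, 4, 3)  # split leading axis
assert einshape("n...->n(...)", x).shape == (6, 48)                 # batch flatten

v = jnp.zeros((5,))
assert einshape("n->n111", v).shape == (5, 1, 1, 1)                 # add three unit axes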

Installation

Einshape can be installed with the following command:

pip3 install git+https://github.com/deepmind/einshape

Einshape will work with either JAX or TensorFlow. To allow for that, it does not list either as a requirement, so you must ensure that JAX or TensorFlow is installed separately.

Usage

JAX version:

(ij)", a) # b is [1, 2, 3, 4] ">
from einshape import jax_einshape as einshape
from jax import numpy as jnp

a = jnp.array([[1, 2], [3, 4]])
b = einshape("ij->(ij)", a)
# b is [1, 2, 3, 4]

TensorFlow version:

(ij)", a) # b is [1, 2, 3, 4] ">
from einshape import tf_einshape as einshape
import tensorflow as tf

a = tf.constant([[1, 2], [3, 4]])
b = einshape("ij->(ij)", a)
# b is [1, 2, 3, 4]

Understanding einshape equations

An einshape equation is always of the form {lhs}->{rhs}, where {lhs} and {rhs} are both expressions. An expression represents the axes of an array; the relationship between the two expressions describes how the array should be transformed.

An expression is a non-empty sequence of the following elements:

Index name

A single letter a-z, representing one axis of an array.

For example, the expressions ab and jq both represent an array of rank 2.

Every index name that is present on the left-hand side of an equation must also be present on the right-hand side. So, ab->a is not a valid equation, but a->ba is valid (and will tile a vector b times).
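
A small sketch of the tiling behaviour described above (assuming JAX is installed; the size of the new index b is supplied as a keyword argument, as in the earlier examples):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

a = jnp.array([1, 2])
b = einshape("a->ba", a, b=3)
# b is [[1, 2], [1, 2], [1, 2]] -- the vector tiled 3 times along a new leading axis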

Ellipsis

..., representing any axes of an array that are not otherwise represented in the expression. This is similar to the use of -1 as a dimension size in a reshape operation.

For example, a...b can represent any array of rank 2 or more: a will refer to the first axis and b to the last. The equation ...ab->...ba will swap the last two axes of an array.

An expression may not include more than one ellipsis (because that would be ambiguous). Like an index name, an ellipsis must be present in both halves of an equation or neither.
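
A sketch of the axis swap described above (assuming JAX is installed; the shape is chosen only for illustration):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.zeros((2, 3, 4))
y = einshape("...ab->...ba", x)
# The leading axes are untouched; the last two are swapped.
assert y.shape == (2, 4, 3)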

Group

({components}), where components is a sequence of index names and ellipsis elements. The entire group corresponds to a single axis of the array; the group's components represent factors of the axis size. This can be used to reshape an axis into many axes. All the factors except at most one must be specified using keyword arguments.

For example, einshape('(ab)->ab', x, a=10) reshapes an array of rank 1 (whose length must be a multiple of 10) into an array of rank 2 (whose first dimension is of length 10).

Groups may not be nested.
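
A sketch of the group example above (assuming JAX is installed):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.arange(20)  # rank 1, length 20 (a multiple of 10)
y = einshape("(ab)->ab", x, a=10)
# The single axis of length 20 is split into factors a=10 and b=2.
assert y.shape == (10, 2)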

Unit

The digit 1, representing a single axis of length 1. This is useful for expanding and squeezing unit dimensions.

For example, the equation 1...->... squeezes a leading axis (which must have length one).
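
A sketch of the squeeze described above (assuming JAX is installed):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.zeros((1, 3, 4))
y = einshape("1...->...", x)
# The leading unit axis is removed, like squeeze(x, axis=0).
assert y.shape == (3, 4)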

Disclaimer

This is not an official Google product.
