State-of-the-Art Neural Networks for Deep Learning

Overview

pyradox

pyradox is a Python library that helps you implement a wide range of state-of-the-art neural network architectures in a fully customizable fashion, built on TensorFlow 2.


Installation

pip install git+https://github.com/Ritvik19/pyradox.git
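For orientation, the sketch below shows the general pattern of composing one of the library's building blocks with the Keras functional API. The import path `pyradox.modules` and the `DenselyConnected` constructor arguments are assumptions inferred from the module table below, not a verified signature; follow the usage links in the tables for the authoritative examples.

```python
import tensorflow as tf
from pyradox import modules  # assumed import path; see the usage links in the tables below

# Minimal classifier sketch on 28x28 grayscale inputs (e.g. MNIST-like data).
inputs = tf.keras.Input(shape=(28, 28, 1))
x = tf.keras.layers.Flatten()(inputs)
# DenselyConnected: a dense layer with optional batch normalization / dropout
# (the constructor arguments here are assumptions, not the verified API).
x = modules.DenselyConnected(32)(x)
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```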

Usage

Modules

| Module | Description | Input Shape | Output Shape | Usage |
| --- | --- | --- | --- | --- |
| Rescale | A layer that rescales the input: x_out = (x_in - mu) / sigma | Arbitrary | Same shape as input | check here |
| Convolution 2D | Applies a 2D convolution followed by optional batch normalization and optional dropout | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Densely Connected | A densely connected layer followed by optional batch normalization and optional dropout | 2D tensor: (batch_size, input_dim) | 2D tensor: (batch_size, n_units) | check here |
| DenseNet Convolution Block | A convolution block for DenseNets | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| DenseNet Transition Block | A transition block for DenseNets | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Dense Skip Connection | Implementation of a skip connection for densely connected layers | 2D tensor: (batch_size, input_dim) | 2D tensor: (batch_size, n_units) | check here |
| VGG Module | Implementation of VGG modules with slight modifications: applies multiple 2D convolutions followed by optional batch normalization, optional dropout, and max pooling | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inception Conv | Implementation of the 2D convolution layer for Inception Net: a convolution layer followed by batch normalization, activation, and optional dropout | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inception Block | Implementation of the Inception mixing block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Xception Block | A customized implementation of the Xception block (depthwise separable convolutions) | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net Block | Implementation of the Efficient Net block (depthwise separable convolutions) | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Conv Skip Connection | Implementation of a skip connection for convolution layers | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net Block | Customized implementation of the ResNet block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net V2 Block | Customized implementation of the ResNetV2 block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res NeXt Block | Customized implementation of the ResNeXt block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inception Res Net Conv 2D | Implementation of the convolution layer for Inception-ResNet: Convolution2D followed by batch normalization | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inception Res Net Block | Implementation of the Inception-ResNet block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | Block 8, Block 17, Block 35 |
| NAS Net Separable Conv Block | Adds two blocks of separable convolution followed by batch normalization | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| NAS Net Adjust Block | Adjusts the previous-path input to match the shape of the main input | | | |
| NAS Net Normal A Cell | Normal cell for NASNet-A | | | |
| NAS Net Reduction A Cell | Reduction cell for NASNet-A | | | |
| Mobile Net Conv Block | Adds an initial convolution layer with batch normalization and activation | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Mobile Net Depth Wise Conv Block | Adds a depthwise convolution block: depthwise convolution, batch normalization, activation, pointwise convolution, batch normalization, and activation | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inverted Res Block | Adds an inverted ResNet block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| SEBlock | Adds a squeeze-and-excite block | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
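To make the table concrete, the plain-Keras sketch below shows the kind of computation the Convolution 2D entry describes: a convolution followed by optional batch normalization and dropout, taking a 4D tensor in and returning a 4D tensor. It is an illustration of the pattern under those assumptions, not pyradox's actual implementation.

```python
import tensorflow as tf

def conv_bn_dropout(x, filters, kernel_size=3, batch_norm=True, dropout=0.1):
    """Conv2D -> optional BatchNormalization -> optional Dropout,
    mirroring the behaviour the Convolution 2D module describes."""
    x = tf.keras.layers.Conv2D(filters, kernel_size, padding="same", activation="relu")(x)
    if batch_norm:
        x = tf.keras.layers.BatchNormalization()(x)
    if dropout:
        x = tf.keras.layers.Dropout(dropout)(x)
    return x

# 4D tensor in (batch, rows, cols, channels), 4D tensor out, as the table states.
inputs = tf.keras.Input(shape=(32, 32, 3))
outputs = conv_bn_dropout(inputs, filters=64)
model = tf.keras.Model(inputs, outputs)
```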

ConvNets

| Module | Description | Input Shape | Output Shape | Usage |
| --- | --- | --- | --- | --- |
| Generalized Dense Nets | A generalization of Densely Connected Convolutional Networks (DenseNets) | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Densely Connected Convolutional Network 121 | A modified implementation of Densely Connected Convolutional Network 121 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Densely Connected Convolutional Network 169 | A modified implementation of Densely Connected Convolutional Network 169 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Densely Connected Convolutional Network 201 | A modified implementation of Densely Connected Convolutional Network 201 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Generalized VGG | A generalization of the VGG network | 4D tensor: (batch_shape, rows, cols, channels) | 4D or 2D tensor | usage 1, usage 2 |
| VGG 16 | A modified implementation of the VGG16 network | 4D tensor: (batch_shape, rows, cols, channels) | 2D tensor: (batch_shape, new_dim) | usage 1, usage 2 |
| VGG 19 | A modified implementation of the VGG19 network | 4D tensor: (batch_shape, rows, cols, channels) | 2D tensor: (batch_shape, new_dim) | usage 1, usage 2 |
| Inception V3 | Customized implementation of Inception Net | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Generalized Xception | Generalized implementation of XceptionNet (depthwise separable convolutions) | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Xception Net | A customized implementation of XceptionNet | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net | Generalized implementation of Efficient Net | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B0 | Customized implementation of Efficient Net B0 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B1 | Customized implementation of Efficient Net B1 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B2 | Customized implementation of Efficient Net B2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B3 | Customized implementation of Efficient Net B3 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B4 | Customized implementation of Efficient Net B4 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B5 | Customized implementation of Efficient Net B5 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B6 | Customized implementation of Efficient Net B6 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Efficient Net B7 | Customized implementation of Efficient Net B7 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net | Customized implementation of ResNet | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 50 | Customized implementation of ResNet 50 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 101 | Customized implementation of ResNet 101 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 152 | Customized implementation of ResNet 152 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net V2 | Customized implementation of ResNet V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 50 V2 | Customized implementation of ResNet 50 V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 101 V2 | Customized implementation of ResNet 101 V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res Net 152 V2 | Customized implementation of ResNet 152 V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res NeXt | Customized implementation of ResNeXt | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res NeXt 50 | Customized implementation of ResNeXt 50 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res NeXt 101 | Customized implementation of ResNeXt 101 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Res NeXt 152 | Customized implementation of ResNeXt 152 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| Inception Res Net V2 | Customized implementation of Inception-ResNet V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| NAS Net | Generalized implementation of NASNet | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| NAS Net Mobile | Customized implementation of NASNet Mobile | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| NAS Net Large | Customized implementation of NASNet Large | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | check here |
| MobileNet | Customized implementation of MobileNet | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | usage 1, usage 2 |
| Mobile Net V2 | Customized implementation of MobileNet V2 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | usage 1, usage 2 |
| Mobile Net V3 | Customized implementation of MobileNet V3 | 4D tensor: (batch_shape, rows, cols, channels) | 4D tensor: (batch_shape, new_rows, new_cols, new_channels) | usage 1, usage 2 |
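A common pattern with these backbones is to append a pooling layer and a classification head. The sketch below assumes an import path `pyradox.convnets` and a no-argument `EfficientNetB0` constructor, both inferred from the table rather than verified; the same pattern applies to the other networks whose output is a 4D feature map (the VGG variants already return a 2D tensor, so the pooling step is unnecessary there).

```python
import tensorflow as tf
from pyradox import convnets  # assumed import path; follow the usage links above for the exact API

inputs = tf.keras.Input(shape=(224, 224, 3))
# Backbone returning a 4D feature map, per the table (class name and
# default constructor are assumptions, not a verified signature).
x = convnets.EfficientNetB0()(inputs)
x = tf.keras.layers.GlobalAveragePooling2D()(x)   # 4D feature map -> 2D vector
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```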

DenseNets

| Module | Description | Input Shape | Output Shape | Usage |
| --- | --- | --- | --- | --- |
| Densely Connected Network | A network of densely connected layers followed by optional batch normalization and optional dropout | 2D tensor: (batch_size, input_dim) | 2D tensor: (batch_size, new_dim) | check here |
| Densely Connected Resnet | A network of skip connections over densely connected layers | 2D tensor: (batch_size, input_dim) | 2D tensor: (batch_size, new_dim) | check here |
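The skip-connection idea behind Densely Connected Resnet can be illustrated in plain Keras as follows: each block adds its 2D input back onto the output of a dense layer (with batch normalization and dropout), projecting the input first if the widths differ. This is a sketch of the concept, not pyradox's implementation; the helper name `dense_skip_block` is hypothetical.

```python
import tensorflow as tf

def dense_skip_block(x, units, dropout=0.1):
    """A skip connection around a densely connected layer: the block's
    input is added back to its (batch-normalized, dropped-out) output."""
    y = tf.keras.layers.Dense(units, activation="relu")(x)
    y = tf.keras.layers.BatchNormalization()(y)
    y = tf.keras.layers.Dropout(dropout)(y)
    if x.shape[-1] != units:                     # project the input if widths differ
        x = tf.keras.layers.Dense(units)(x)
    return tf.keras.layers.Add()([x, y])

inputs = tf.keras.Input(shape=(64,))             # 2D input (batch_size, input_dim)
x = dense_skip_block(inputs, units=32)
x = dense_skip_block(x, units=32)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)
```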