Local Multi-Head Channel Self-Attention for FER2013


LHC-Net

Local Multi-Head Channel Self-Attention

This repository provides a quick implementation of LHC-Net and lets you replicate the results reported in this paper on FER2013, either by downloading our trained models or, if your hardware is compatible, by training the models from scratch. A fully custom training routine is also available.

[Image: LHC_Net] [Image: LHC_Module2]

How to check the replicability of our results without full training

Bit-exact replicability is strongly hardware dependent. Since the results we present depend on starting from a very well performing ResNet34v2 model, we strongly recommend running the replicability script before attempting our training protocol, which is computationally intensive and time consuming.
Execute the following commands in your terminal:

python Download_Data.py
python ETL.py
python check_rep.py

Or, equivalently:

python main_check_rep.py

If you get the output "Replicable Results!" you will almost certainly (≈99%) obtain our exact results; if you get "Not Replicable Results. Change your GPU!" you will not be able to reproduce them.
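
The exact logic lives in check_rep.py; purely as an illustration of what a determinism probe of this kind looks like, the sketch below trains a tiny model for a couple of epochs with fixed seeds and compares the outcome against a stored reference value (the reference file name and the tiny model are hypothetical, not taken from check_rep.py):

# Illustrative sketch only -- not the actual logic of check_rep.py.
# A determinism probe of this kind trains a tiny model with fixed seeds and
# compares the result against a stored reference value.
import numpy as np
import tensorflow as tf

np.random.seed(0)
tf.random.set_seed(0)

x = np.random.rand(256, 48, 48, 1).astype("float32")
y = np.random.randint(0, 7, size=(256,))

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(48, 48, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(7, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
history = model.fit(x, y, epochs=2, batch_size=32, shuffle=False, verbose=0)

reference_loss = float(np.load("reference_loss.npy"))  # hypothetical reference file
if history.history["loss"][-1] == reference_loss:      # bit-exact comparison
    print("Replicable Results!")
else:
    print("Not Replicable Results. Change your GPU!")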

Please note that Download_Data.py downloads the FER2013 dataset in .csv format, while ETL.py saves all 28709 images of the training set as .jpeg files in order to allow the use of the TensorFlow image data generator and to save some memory.
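
For reference, the conversion is conceptually simple: each CSV row stores a 48x48 grayscale face as 2304 space-separated pixel values, which can be reshaped and written to disk as one .jpeg per image. A minimal sketch follows; the file names and folder layout are assumptions, not necessarily what ETL.py produces:

# Minimal sketch of the CSV-to-image conversion; the actual ETL.py may differ
# in file names, folder layout and labelling.
import os
import numpy as np
import pandas as pd
from PIL import Image

df = pd.read_csv("fer2013.csv")                  # columns: emotion, pixels, Usage
train = df[df["Usage"] == "Training"]            # 28709 rows

for i, row in train.iterrows():
    class_dir = os.path.join("train_images", str(row["emotion"]))
    os.makedirs(class_dir, exist_ok=True)
    pixels = np.array(row["pixels"].split(), dtype=np.uint8).reshape(48, 48)
    Image.fromarray(pixels, mode="L").save(os.path.join(class_dir, f"{i}.jpeg"))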

Recommended setup for full replicability:
Nvidia Geforce GTX-1080ti (other Pascal-based GPUs might work)
GPU Driver 457.51
Cuda Driver 11.1.1*
CuDNN v8.0.5 - 11.1
Python 3.8.5
requirements.txt

*After the Cuda installation rename C:...\NVIDIA GPU Computing Toolkit\CUDA\v11.1\bin\cusolver64_11.dll to cusolver64_10.dll
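
A quick way to confirm which CUDA/cuDNN build your TensorFlow was compiled against, and that the GPU is actually visible, is sketched below (tf.sysconfig.get_build_info() exposes these keys on recent TensorFlow 2 GPU builds; older versions may not provide them):

# Environment sanity check: prints the CUDA/cuDNN versions TensorFlow was built
# against and the GPUs it can see.
import tensorflow as tf

build = tf.sysconfig.get_build_info()
print("CUDA:", build.get("cuda_version"))
print("cuDNN:", build.get("cudnn_version"))
print("GPUs:", tf.config.list_physical_devices("GPU"))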

How to download our trained models and evaluate their performances on FER2013

Execute the following commands in your terminal:

python Download_Data.py
python Download_Models.py
python LHC_Downloaded_Eval.py
python Controller_Downloaded_Eval.py

Or, equivalently:

python main_downloaded.py
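
Under the hood, evaluating a downloaded model amounts to loading it with Keras and running it on the FER2013 private test split. The outline below is only a rough sketch: the model file name and the preprocessing are assumptions, and the actual logic lives in LHC_Downloaded_Eval.py and Controller_Downloaded_Eval.py:

# Rough outline only; the file name and preprocessing are assumptions, and
# loading the real model may also require custom_objects for the LHC layers.
import numpy as np
import pandas as pd
import tensorflow as tf

df = pd.read_csv("fer2013.csv")
test = df[df["Usage"] == "PrivateTest"]
x = np.stack([np.array(p.split(), dtype=np.float32).reshape(48, 48, 1)
              for p in test["pixels"]]) / 255.0
y = test["emotion"].to_numpy()

model = tf.keras.models.load_model("LHC_Net.h5")     # hypothetical file name
pred = model.predict(x, verbose=0).argmax(axis=1)
print("FER2013 private test accuracy:", float((pred == y).mean()))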

How to train and evaluate your own LHC-Net on FER2013 in the "standalone" mode

To train an LHC-Net using a generic ImageNet pre-trained ResNet backbone, edit the configuration files in the Settings folder and execute the following commands in your terminal:

python Download_Data.py
python ETL.py
python LHC_Net_Train.py
python LHC_Net_Eval.py

Or, equivalently:

python main_standalone.py
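
To give a flavour of the kind of block being trained in this mode, below is a generic channel self-attention layer attached to an ImageNet pre-trained Keras backbone. It illustrates the general idea only and is NOT the LHC module of this repository (multiple heads, local pooling and the controller are omitted; ResNet50V2 is used because ResNet34 is not shipped with tf.keras.applications):

# Generic channel self-attention block, for illustration only; it is NOT the
# LHC module defined in this repository.
import tensorflow as tf

class ChannelSelfAttention(tf.keras.layers.Layer):
    def __init__(self, key_dim=64, **kwargs):
        super().__init__(**kwargs)
        self.key_dim = key_dim

    def build(self, input_shape):
        hw = int(input_shape[1]) * int(input_shape[2])   # assumes a fixed spatial size
        self.wq = self.add_weight("wq", shape=(hw, self.key_dim))
        self.wk = self.add_weight("wk", shape=(hw, self.key_dim))

    def call(self, x):
        h, w, c = x.shape[1], x.shape[2], x.shape[3]
        tokens = tf.reshape(tf.transpose(x, [0, 3, 1, 2]), (-1, c, h * w))  # one token per channel
        q = tf.einsum("bch,hd->bcd", tokens, self.wq)
        k = tf.einsum("bch,hd->bcd", tokens, self.wk)
        attn = tf.nn.softmax(
            tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(float(self.key_dim)), axis=-1)
        out = tf.matmul(attn, tokens)                                       # re-mix the channels
        out = tf.transpose(tf.reshape(out, (-1, c, h, w)), [0, 2, 3, 1])
        return x + out                                                      # residual connection

# Attach it to an ImageNet pre-trained backbone and add a 7-class FER head.
backbone = tf.keras.applications.ResNet50V2(include_top=False, input_shape=(224, 224, 3))
features = ChannelSelfAttention()(backbone.output)
pooled = tf.keras.layers.GlobalAveragePooling2D()(features)
outputs = tf.keras.layers.Dense(7, activation="softmax")(pooled)
model = tf.keras.Model(backbone.input, outputs)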

How to train and evaluate LHC-Net on FER2013 in our "modular" mode and replicate our results

If the replicability check gave a positive result, you can replicate our results by integrating the LHC modules into a ResNet backbone already trained on FER2013 and training them, according to our first experimental protocol. To do that, execute the following commands in your terminal:

python Download_Data.py
python ETL.py
python ResNet34_Train.py
python LHC_Train.py
python Controller_Train.py
python LHC_Eval.py
python Controller_Eval.py

Or, equivalently:

python main_modular.py
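
The key difference from the standalone mode is that here the backbone has already been trained on FER2013 and only the newly inserted modules (and later the controller) are optimised. A generic sketch of that freezing pattern in Keras follows, reusing the illustrative ChannelSelfAttention layer from the previous sketch; the file and layer names are hypothetical and do not match the repository's exact code:

# Generic "train only the new modules" pattern; file and layer names are
# hypothetical, and ChannelSelfAttention is the illustrative layer defined
# above, not the repository's LHC module.
import tensorflow as tf

backbone = tf.keras.models.load_model("ResNet34_FER2013.h5")    # hypothetical file name
backbone.trainable = False                                      # freeze the FER2013-trained weights

features = backbone.get_layer("last_conv_block").output         # hypothetical layer name
x = ChannelSelfAttention()(features)
outputs = tf.keras.layers.Dense(7, activation="softmax")(
    tf.keras.layers.GlobalAveragePooling2D()(x))
model = tf.keras.Model(backbone.input, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_generator, epochs=20, validation_data=val_generator)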