Smooth ReLU in PyTorch

License: MIT


Unofficial PyTorch reimplementation of the Smooth ReLU (SmeLU) activation function proposed in the paper Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations by Gil I. Shamir and Dong Lin.

This repository includes an easy-to-use pure PyTorch implementation of the Smooth ReLU.
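
For reference, SmeLU is defined piecewise: it is zero for x <= -beta, equal to the identity for x >= beta, and follows the quadratic (x + beta)^2 / (4 * beta) on the transition region |x| <= beta. A minimal functional sketch of this definition (an illustration of the formula, not necessarily identical to the module shipped in this repository) could look like:

import torch

def smelu(x: torch.Tensor, beta: float = 2.0) -> torch.Tensor:
    # Zero below -beta, identity above beta, quadratic transition in between.
    return torch.where(
        x >= beta,
        x,
        torch.where(x <= -beta, torch.zeros_like(x), (x + beta) ** 2 / (4.0 * beta)),
    )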

In case you run into performance issues with this implementation, please have a look at my Triton SmeLU implementation.

Installation

The SmeLU can be installed via pip:

pip install git+https://github.com/ChristophReich1996/SmeLU

Example Usage

The SmeLU can simply be used as a standard nn.Module:

import torch
import torch.nn as nn
from smelu import SmeLU

network: nn.Module = nn.Sequential(
    nn.Linear(2, 2),
    SmeLU(),  # smooth drop-in replacement for nn.ReLU()
    nn.Linear(2, 2)
)

# Forward a random batch of 16 samples through the network
output: torch.Tensor = network(torch.rand(16, 2))

For a more detailed example of how to use this implementation, please refer to the example file (requires Matplotlib to be installed).

The SmeLU takes the following parameter:

Parameter | Description | Type
--- | --- | ---
beta | Beta value of the SmeLU activation function. Default: 2. | float
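
Larger beta values widen the smooth quadratic transition around zero, while beta approaching 0 recovers the ordinary ReLU. A quick way to see this effect, assuming the beta keyword documented above:

import torch
from smelu import SmeLU

x = torch.linspace(-3.0, 3.0, steps=7)
for beta in (0.5, 2.0, 4.0):
    # At x = 0 the SmeLU outputs beta / 4, so larger beta lifts the transition region.
    print(f"beta={beta}: {SmeLU(beta=beta)(x)}")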

Reference

@article{Shamir2022,
        title={{Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations}},
        author={Shamir, Gil I and Lin, Dong},
        journal={{arXiv preprint arXiv:2202.06499}},
        year={2022}
}
