Recurrent Neural Networks

Implements a simple recurrent network and a stacked recurrent network in numpy and torch, respectively. Both flavours implement a forward and backward function API that is responsible for the model's behaviour in the forward and backward passes. The backward pass is implemented with native numpy/torch tensor operations; no autograd engine is used.
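As a minimal sketch of what such a manually differentiated pair can look like for a single vanilla RNN step (hypothetical function and parameter names, not the repo's actual API):

import numpy as np

def rnn_step_forward(x, h_prev, Wxh, Whh, bh):
    # h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh)
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
    cache = (x, h_prev, h, Wxh, Whh)
    return h, cache

def rnn_step_backward(dh, cache):
    # Backprop through one step; dh is the gradient flowing into h_t.
    x, h_prev, h, Wxh, Whh = cache
    dz = dh * (1.0 - h ** 2)           # tanh'(z) = 1 - tanh(z)^2
    dWxh = np.outer(dz, x)
    dWhh = np.outer(dz, h_prev)
    dbh = dz
    dx = Wxh.T @ dz
    dh_prev = Whh.T @ dz
    return dx, dh_prev, dWxh, dWhh, dbh

In a full backward pass, these per-step gradients are accumulated across the unrolled sequence (truncated backpropagation through time).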

main.py is a thin wrapper that instantiates the appropriate model class and trains a recurrent network on tinyshakespeare. After training for a while, we can autoregressively sample random poems from the model.
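Autoregressive sampling feeds each predicted character back in as the next input. A sketch under assumed names (step, the one-hot encoding, and the returned logits are hypothetical stand-ins for whatever the trained model exposes):

import numpy as np

def sample(step, h, seed_ix, vocab_size, n_chars, rng=None):
    # step(x, h) -> (logits, h_next) is one forward step of the trained RNN.
    rng = rng or np.random.default_rng()
    ix, out = seed_ix, []
    for _ in range(n_chars):
        x = np.zeros(vocab_size)
        x[ix] = 1.0                        # one-hot encode the previous character
        logits, h = step(x, h)
        p = np.exp(logits - logits.max())  # numerically stable softmax
        p /= p.sum()
        ix = rng.choice(vocab_size, p=p)   # draw the next character
        out.append(ix)
    return out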

The models in models/ can also be trained on other character-level corpora, such as linear algebra text or Linux source code.

Requirements

  1. numpy
  2. pytorch
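
Both can be installed with pip:

$ pip install numpy torch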

Training

Edit the main.py file to configure an RNN model by specifying the number of hidden layers, sequence_length, and so on. Then execute the following command in a terminal.

$ python3 main.py
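
The exact hyperparameter names live in main.py; a hypothetical configuration might look like:

# Hypothetical names; check main.py for the actual ones.
config = dict(
    hidden_size=128,      # units per recurrent layer
    num_layers=2,         # depth of the stacked RNN
    sequence_length=50,   # characters per truncated-BPTT chunk
    learning_rate=1e-2,
)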

TODO

  1. Multilayer GRU and LSTM
  2. Transformer

License

MIT
