Tensors and neural networks in Haskell

Overview

Hasktorch

Hasktorch is a library for tensors and neural networks in Haskell. It is an independent open-source community project that leverages the core C++ libraries shared by PyTorch.
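
To give a flavor of the API, here is a minimal sketch of basic tensor usage. It assumes the top-level Torch module re-exports asTensor, ones', matmul, shape, and asValue, as in recent Hasktorch 0.2 versions; check the API documentation for the exact names in your version.

import Torch

main :: IO ()
main = do
  let x = asTensor ([[1, 2], [3, 4]] :: [[Float]])  -- build a 2x2 tensor from a nested list
      y = ones' [2, 2]                              -- a 2x2 tensor filled with ones
      z = matmul x y                                -- matrix multiplication
  print (shape z)                                   -- [2,2]
  print (asValue z :: [[Float]])                    -- convert back to a Haskell list

If this compiles and prints a 2x2 result, your toolchain and the shared libtorch libraries are wired up correctly.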

This project is in active development, so expect changes to the library API as it evolves. We invite new users to join our Hasktorch Slack space for questions and discussions. Contributions/PRs are encouraged.

We are currently developing the second major release of Hasktorch (0.2). Note that the first release, Hasktorch 0.1, on Hackage is outdated and should not be used.

Documentation

The documentation is divided into several sections:

Introductory Videos

Getting Started

The following steps will get you started. They assume you have just cloned the hasktorch repository. After setup is done, read the online tutorials and the API documentation.
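
If you have not cloned the repository yet, something along these lines should work; the --recurse-submodules flag pulls in the submodules under deps/ and inline-c/, and whether you need them depends on your build path:

$ git clone --recurse-submodules https://github.com/hasktorch/hasktorch.git  # Clone the repository with its submodules.
$ cd hasktorch                                                               # Enter the top-level directory of the project.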

linux+cabal+cpu

Starting from the top-level directory of the project, run:

$ pushd deps       # Change to the deps directory and save the current directory.
$ ./get-deps.sh    # Run the shell script to retrieve the libtorch dependencies.
$ popd             # Go back to the root directory of the project.
$ source setenv    # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.

To build and test the Hasktorch library, run:

$ cabal build hasktorch  # Build the Hasktorch library.
$ cabal test hasktorch   # Build and run the Hasktorch library test suite.
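
After a successful build, a quick sanity check from the REPL can confirm that the native libraries load correctly. This is a sketch; your GHCi prompt and the exact module layout may differ:

$ cabal repl hasktorch   # Open a GHCi session with the Hasktorch library loaded.
ghci> import Torch
ghci> shape (ones' [2, 3])
[2,3]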

To build and test the example executables shipped with hasktorch, run:

$ cabal build examples  # Build the Hasktorch examples.
$ cabal test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu             # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn    # Run the MNIST CNN example.

linux+cabal+cuda11

Starting from the top-level directory of the project, run:

$ pushd deps              # Change to the deps directory and save the current directory.
$ ./get-deps.sh -a cu111  # Run the shell script to retrieve the libtorch dependencies (CUDA 11.1 build).
$ popd                    # Go back to the root directory of the project.
$ source setenv           # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh        # Create a cabal project file.

To build and test the Hasktorch library, run:

$ cabal build hasktorch  # Build the Hasktorch library.
$ cabal test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ cabal build examples  # Build the Hasktorch examples.
$ cabal test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE="cuda:0"        # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn    # Run the MNIST CNN example.
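
The MNIST examples pick their compute device from the DEVICE environment variable. In your own code, a tensor can be placed on the GPU roughly as follows; this is a sketch assuming the Torch module exports Device, DeviceType, toDevice, and device, so check the API documentation for the exact names and field types:

import Torch

main :: IO ()
main = do
  let cuda0 = Device { deviceType = CUDA, deviceIndex = 0 }  -- first CUDA device
      x     = ones' [3, 3]                                   -- created on the CPU by default
      x'    = toDevice cuda0 x                               -- copy the tensor to the GPU
  print (device x')                                          -- should report the CUDA device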

macos+cabal+cpu

Starting from the top-level directory of the project, run:

$ pushd deps       # Change to the deps directory and save the current directory.
$ ./get-deps.sh    # Run the shell script to retrieve the libtorch dependencies.
$ popd             # Go back to the root directory of the project.
$ source setenv    # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.

To build and test the Hasktorch library, run:

$ cabal build hasktorch  # Build the Hasktorch library.
$ cabal test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ cabal build examples  # Build the Hasktorch examples.
$ cabal test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu             # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn    # Run the MNIST CNN example.

linux+stack+cpu

Install the Haskell Tool Stack if you haven't already, following the instructions here.

Starting from the top-level directory of the project, run:

$ pushd deps     # Change to the deps directory and save the current directory.
$ ./get-deps.sh  # Run the shell script to retrieve the libtorch dependencies.
$ popd           # Go back to the root directory of the project.
$ source setenv  # Set the shell environment to reference the shared library locations.

To build and test the Hasktorch library, run:

$ stack build hasktorch  # Build the Hasktorch library.
$ stack test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ stack build examples  # Build the Hasktorch examples.
$ stack test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu             # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn    # Run the MNIST CNN example.

macos+stack+cpu

Install the Haskell Tool Stack if you haven't already, following the instructions here.

Starting from the top-level directory of the project, run:

$ pushd deps     # Change to the deps directory and save the current directory.
$ ./get-deps.sh  # Run the shell script to retrieve the libtorch dependencies.
$ popd           # Go back to the root directory of the project.
$ source setenv  # Set the shell environment to reference the shared library locations.

To build and test the Hasktorch library, run:

$ stack build hasktorch  # Build the Hasktorch library.
$ stack test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ stack build examples  # Build the Hasktorch examples.
$ stack test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu             # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn    # Run the MNIST CNN example.

nixos+cabal+cpu

(Optional) Install and set up Cachix:

$ nix-env -iA cachix -f https://cachix.org/api/v1/install  # (Optional) Install Cachix.
$ cachix use iohk                                          # (Optional) Use IOHK's cache.
$ cachix use hasktorch                                     # (Optional) Use hasktorch's cache.

Starting from the top-level directory of the project, run:

$ nix-shell  # Enter the nix shell environment for Hasktorch.

To build and test the Hasktorch library, run:

$ cabal build hasktorch  # Build the Hasktorch library.
$ cabal test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ cabal build examples  # Build the Hasktorch examples.
$ cabal test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu             # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn    # Run the MNIST CNN example.

nixos+cabal+cuda11

(Optional) Install and set up Cachix:

$ nix-env -iA cachix -f https://cachix.org/api/v1/install  # (Optional) Install Cachix.
$ cachix use iohk                                          # (Optional) Use IOHK's cache.
$ cachix use hasktorch                                     # (Optional) Use hasktorch's cache.

Starting from the top-level directory of the project, run:

$ nix-shell --arg cudaSupport true --argstr cudaMajorVersion 11  # Enter the nix shell environment for Hasktorch.

To build and test the Hasktorch library, run:

$ cabal build hasktorch  # Build the Hasktorch library.
$ cabal test hasktorch   # Build and run the Hasktorch library test suite.

To build and test the example executables shipped with hasktorch, run:

$ cabal build examples  # Build the Hasktorch examples.
$ cabal test examples   # Build and run the Hasktorch example test suites.

To run the MNIST CNN example, run:

$ cd examples                   # Change to the examples directory.
$ ./datasets/download-mnist.sh  # Download the MNIST dataset.
$ mv mnist data                 # Move the MNIST dataset to the data directory.
$ export DEVICE="cuda:0"        # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn    # Run the MNIST CNN example.

docker+jupyterlab+cuda11

This Docker Hub repository provides a Docker image with JupyterLab. Images are available for cuda11, cuda10, and cpu-only. To use JupyterLab with Hasktorch, run one of the following commands, then click the URL printed in the console.

$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter
or
$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter:latest-cu11

Known Issues

Tensors Cannot Be Moved to CUDA

In rare cases, you may see errors like

cannot move tensor to "CUDA:0"

although you have CUDA-capable hardware in your machine and have followed the getting-started instructions for CUDA support.

If that happens, check whether /run/opengl-driver/lib exists. If it does not, make sure your CUDA drivers are installed correctly.

Weird Behaviour When Switching from CPU-Only to CUDA-Enabled Nix Shell

If you have run cabal in a CPU-only Hasktorch Nix shell before, you may need to take the following steps (see the example commands after this list):

  • Clean the dist-newstyle folder using cabal clean.
  • Delete the .ghc.environment* file in the Hasktorch root folder.
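
For example, starting from the Hasktorch root folder (the exact .ghc.environment file name depends on your GHC version, so the wildcard is illustrative):

$ cabal clean            # Clean the dist-newstyle folder.
$ rm .ghc.environment*   # Delete the GHC environment file(s) in the project root.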

Otherwise, at best, you will not be able to move tensors to CUDA, and, at worst, you will see weird linker errors like

gcc: error: hasktorch/dist-newstyle/build/x86_64-linux/ghc-8.8.3/libtorch-ffi-1.5.0.0/build/Torch/Internal/Unmanaged/Autograd.dyn_o: No such file or directory
`cc' failed in phase `Linker'. (Exit code: 1)

Contributing

We welcome new contributors.

Contact us for access to the Hasktorch Slack channel. You can send an email to [email protected] or reach out on Twitter to @austinvhuang, @SamStites, @tscholak, or @junjihashimoto3.

Notes for library developers

See the wiki for developer notes.

Project Folder Structure

Basic functionality:

  • deps/ - submodules and downloads for build dependencies (libtorch, mklml, pytorch) -- you can ignore this if you are on Nix
  • examples/ - high level example models (xor mlp, typed cnn, etc.)
  • experimental/ - experimental projects or tips
  • hasktorch/ - higher level user-facing library, calls into ffi/, used by examples/

Internals (for contributing developers):

  • codegen/ - code generation, parses Declarations.yaml spec from pytorch and produces ffi/ contents
  • inline-c/ - submodule to inline-cpp fork used for C++ FFI
  • libtorch-ffi/ - low-level FFI bindings to libtorch
  • spec/ - specification files used for codegen/