Deep learning PyTorch library for time series forecasting, classification, and anomaly detection

Overview

Deep learning for time series forecasting

Flow Forecast is an open-source deep learning framework for time series forecasting. It provides the latest state-of-the-art models (transformers, attention models, GRUs) and cutting-edge concepts with easy-to-understand interpretability metrics, cloud provider integration, and model serving capabilities. Flow Forecast was the first time series framework to feature support for transformer-based models and remains the only true end-to-end deep learning framework for time series forecasting. The repository is currently maintained primarily by Task-TS from CoronaWhy. Pull requests are welcome. Historically, this repository provided open-source benchmarks and code for flash flood and river flow forecasting.

For additional tutorials (on Colab) and examples please see our tutorials repository.

[Status badges: CircleCI build (master branch), Python package upload, documentation status, codecov coverage, CodeFactor code quality.]

Getting Started

Using the library

  1. Run pip install flood-forecast
  2. Detailed info on training models can be found on the Wiki.
  3. Check out our Confluence Documentation
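
Training is driven by a JSON configuration file. Below is a minimal sketch of kicking off a run programmatically, assuming the train_function entry point in flood_forecast.trainer; the configuration file name here is hypothetical, and the full configuration schema is documented on the Wiki and Confluence pages.

    # Minimal sketch of launching a training run (the config file name is a
    # placeholder; "model_type" is the key the trainer dispatches on, and the
    # return value being the fitted model is an assumption).
    import json

    from flood_forecast.trainer import train_function

    with open("example_config.json") as f:
        training_config = json.load(f)

    trained_model = train_function(training_config["model_type"], training_config)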

Models currently supported

  1. Vanilla LSTM (LSTM): A basic LSTM that is suitable for multivariate time series forecasting and transfer learning.
  2. Full transformer (SimpleTransformer in model_dict): The full original transformer with all 8 encoder and decoder blocks. Requires passing the target in at inference.
  3. Simple Multi-Head Attention (MultiHeadSimple): A simple multi-head attention block and linear embedding layers. Suitable for transfer learning.
  4. Transformer with a linear decoder (CustomTransformerDecoder in model_dict): A transformer with n-encoder blocks (this is tunable) and a linear decoder.
  5. DA-RNN (DARNN): A well-rounded model which utilizes an LSTM + attention.
  6. Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (called DecoderTransformer in model_dict)
  7. Transformer XL
  8. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting (Informer)
  9. DeepAR

Forthcoming Models

We have a number of models we are planning on releasing soon. Please check our project board for more information.

Integrations

Google Cloud Platform

Weights and Biases

Contributing

For instructions on contributing please see our contributions page and our project board.

Historical River Flow Data

Task 1 Stream Flow Forecasting

This task focuses on forecasting a stream's future flow/height (in either cfs or feet, respectively) given factors such as current flow, temperature, and precipitation. In the future, we plan on adding more variables that help with stream flow prediction, such as snow pack data and the surrounding soil moisture index.

Task 2 Flood severity forecasting

Task two focuses on predicting the severity of the flood based on the flood forecast, population information, and topography. Flood severity is defined based on several factors, including the number of injuries, property damage, and crop damage.

If you use either the data or code from this repository, please use the citation below. Additionally, please cite the original authors of the models.

@misc{godfried2020flowdb,
      title={FlowDB a large scale precipitation, river, and flash flood dataset}, 
      author={Isaac Godfried and Kriti Mahajan and Maggie Wang and Kevin Li and Pranjalya Tiwari},
      year={2020},
      eprint={2012.11154},
      archivePrefix={arXiv},
      primaryClass={cs.AI}
}
Comments
  • Informer compatibility with interpretability methods

    Currently, Informer does not work with the SHAP interpretability methods. Getting these methods to work with Informer will likely require some significant refactoring, as with Informer we have the target being passed in. We should also likely design a helper function to better assist with this, e.g. around code such as:

        history, _, forecast_start_idx = csv_test_loader.get_from_start_date(datetime_start)
        background_tensor = _prepare_background_tensor(csv_test_loader)
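
    A rough sketch of what such a helper might look like is given below. It is only an assumption about the intended design, reusing the get_from_start_date loader method and the _prepare_background_tensor idea from above; the background construction is purely illustrative.

        # Hypothetical helper (not existing library code): gather the tensors a SHAP
        # explainer needs when the model, like Informer, expects the target at inference.
        import torch


        def prepare_shap_inputs(csv_test_loader, datetime_start, background_len: int = 100):
            # get_from_start_date is the loader method referenced above; `history` is
            # assumed here to be a (timesteps, features) tensor or array.
            history, _, forecast_start_idx = csv_test_loader.get_from_start_date(datetime_start)

            # Illustrative background: the first `background_len` steps as a batch of one.
            # A real helper would more likely sample several windows from the series.
            background_tensor = torch.as_tensor(history)[:background_len].unsqueeze(0).float()
            return history, background_tensor, forecast_start_idx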

    enhancement 
    opened by isaacmg 7
  • Inference mode for time series models

    Create a predict function which does inference for time series models without requiring the target to be present. This module should initialize the model using the given configuration file (with a weight path). It should be able to consume a CSV file or query a SQL table #102 (though this functionality is not required in the initial PR). It should ideally make use of the existing evaluator.py module.
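
    A rough sketch of what such a predict function could look like is shown below. It is an assumption about the design rather than existing code: the config keys mirror the inference_params example quoted later on this page, and the decoding is a plain forward pass where a real implementation would reuse evaluator.py.

        # Hypothetical inference sketch (not the library's actual API): run a trained
        # model over a CSV of features without requiring the target column.
        import json

        import pandas as pd
        import torch


        def predict(model: torch.nn.Module, config_path: str, csv_path: str) -> pd.DataFrame:
            with open(config_path) as f:
                params = json.load(f)

            df = pd.read_csv(csv_path)
            cols = params["dataset_params"]["relevant_cols"]   # feature columns to feed the model
            features = torch.as_tensor(df[cols].values).float()

            model.eval()
            with torch.no_grad():
                # A real implementation would reuse the decoding logic in evaluator.py
                # (e.g. simple_decode) rather than a single forward pass.
                preds = model(features.unsqueeze(0))           # batch of one
            return pd.DataFrame(preds.squeeze(0).cpu().numpy())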

    Acceptance Criteria

    • [ ] Passing tests
    deployment 
    opened by isaacmg 7
  • about dataset

    How can I download the FlowDB dataset? Gsutil is not working. Can you give some details about the dataset, and tell us how to use your model with FlowDB? Thanks!

    opened by Vipermdl 6
  • Is the datetime_start parameter in inference_params the forecasting start date?

    In your Infer.ipynb, is the datetime_start parameter the forecasting start date? (Your predict_cfs bucket has expired.)

    'inference_params': {
        'dataset_params': {
            'file_path': 'gs://predict_cfs/day_addition/01064118KPWM_flow.csv',
            'forecast_history': 8,
            'forecast_length': 1,
            'interpolate_param': {'method': 'back_forward', 'params': {}},
            'relevant_cols': ['cfs1', 'precip', 'temp', 'month'],
            'scaling': RobustScaler(),
            'sort_column': 'hour_updated',
            'target_col': ['cfs1']
        },
        'datetime_start': '2018-05-31',
        'decoder_params': {'decoder_function': 'simple_decode', 'unsqueeze_dim': 1},
        'hours_to_forecast': 336,
        'num_prediction_samples': 30,
        'test_csv_path': 'gs://predict_cfs/day_addition/01064118KPWM_flow.csv'
    }

    opened by JJNET 5
  • Poor informer performance

    The performance of the Informer model still seems to be poor, at least with respect to forecasting the Virgin River flow. There may still be bugs, so we should investigate it on other datasets and add more unit tests. Possibly we should also try to replicate the performance on the ETH datasets the model was trained on (related to #314). The model does not seem to learn anything from the temporal data input.

    opened by isaacmg 5
  • Adding GPU support to the Informer

    This PR aims to resolve prior issues (#343) as well as fix a new problem related to the label_len in the data loader. This PR also includes documentation updates for the Informer and additional information on how to use the relevant data loaders and SHAP features.

    opened by isaacmg 5
  • DecoderTransformer: Distinguishing Known inputs from Observed inputs

    Hello Isaac. First of all, thank you for this brilliant project. I was able to run the Decoder Transformer on the EU Wind Energy dataset.

    One question though. The model's paper, when defining the problem, says that some exogenous time series are known up to the forecast horizon. For example, I would like to add the wind forecast as a feature with a middle dimension equal to "forecast_length" and with the same time idx as the target. Is there a way to model this in your config_file or at a lower level within the Loader objects?

    Thank you

    Lorenzo Ostano

    opened by Vergangenheit 5
  • TypeError: Object of type Tensor is not JSON serializable when running train_transformer_style with takes_target as 1

    Traceback (most recent call last):
      File "flood_forecast/trainer.py", line 108, in <module>
        main()
      File "flood_forecast/trainer.py", line 103, in main
        train_function(training_config["model_type"], training_config)
      File "flood_forecast/trainer.py", line 42, in train_function
        train_transformer_style(model=trained_model,
      File "/home/harsh/Documents/Coronawhy/flow-forecast/flood_forecast/pytorch_training.py", line 146, in train_transformer_style
        model.save_model(model_filepath, max_epochs)
      File "/home/harsh/Documents/Coronawhy/flow-forecast/flood_forecast/time_model.py", line 152, in save_model
        json.dump(self.params, p)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/__init__.py", line 179, in dump
        for chunk in iterable:
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 431, in _iterencode
        yield from _iterencode_dict(o, _current_indent_level)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
        yield from chunks
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
        yield from chunks
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 438, in _iterencode
        o = _default(o)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 179, in default
        raise TypeError(f'Object of type {o.__class__.__name__} '
    TypeError: Object of type Tensor is not JSON serializable
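
    One possible fix, sketched below, is to give json.dump a default serializer that converts tensors to plain Python lists before writing the parameters. This assumes the offending values in self.params are torch.Tensor objects (as the traceback suggests) and is an illustration rather than the repository's actual patch.

        # Sketch of a tensor-safe JSON dump (illustrative values; the real fix would
        # belong in save_model in flood_forecast/time_model.py, per the traceback).
        import json

        import torch


        def _tensor_safe(obj):
            # json.dump calls this only for objects it cannot serialize natively.
            if isinstance(obj, torch.Tensor):
                return obj.detach().cpu().tolist()
            raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")


        params = {"takes_target": 1, "example_weight": torch.tensor([0.1, 0.2])}  # illustrative
        with open("model_params.json", "w") as p:
            json.dump(params, p, default=_tensor_safe)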

    opened by 97harsh 5
  • Add meta-data fusion method and documentation

    Based on #100, we want to fuse meta-data with temporal data to enable better time series forecasts. (One generic fusion approach is sketched after the checklist below.)

    • [x] Create a design document of meta-data fusion methods and explain relevant approaches
    • [x] Review design document with @kritim13 and other teammates.
    • [x] Implement agreed upon approach
    • [x] Create a JSON config file and appropriate unit tests.
    • [x] Test end to end in the Kaggle Notebook.
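
    The sketch below illustrates one generic way such a fusion could work: a static meta-data vector is embedded and concatenated onto every time step's features before the temporal model runs. It is only an illustration of the concept, not the approach agreed upon in the design document.

        # Illustrative meta-data fusion (not the repository's implementation): embed a
        # static meta-data vector and concatenate it to each time step of the series.
        import torch
        import torch.nn as nn


        class MetaFusion(nn.Module):
            def __init__(self, n_temporal_feats: int, n_meta_feats: int, meta_dim: int = 8):
                super().__init__()
                self.meta_embed = nn.Linear(n_meta_feats, meta_dim)
                self.out_feats = n_temporal_feats + meta_dim

            def forward(self, temporal: torch.Tensor, meta: torch.Tensor) -> torch.Tensor:
                # temporal: (batch, seq_len, n_temporal_feats); meta: (batch, n_meta_feats)
                m = torch.relu(self.meta_embed(meta))                # (batch, meta_dim)
                m = m.unsqueeze(1).expand(-1, temporal.size(1), -1)  # broadcast over time
                return torch.cat([temporal, m], dim=-1)              # fused features


        fused = MetaFusion(3, 5)(torch.randn(2, 20, 3), torch.randn(2, 5))  # shape (2, 20, 11)
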
    meta-data 
    opened by isaacmg 5
  • Get ASOS data on GCS for years 2014-2019

    Get all the data on GCS for those dates.

    • [x] Create looping function to perform action
    • [x] Create list of ASOS stations already saved with path on GCS. Upload this file to GCS.
    • [x] Run and get all data on GCS for all gages
    opened by isaacmg 5
  • DecoderTransformer not implemented as paper at all

    Did I miss something? The DecoderTransformer, which the documentation claims implements the paper (Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting), is not even close to what the paper proposed. Key components such as the conv1d layers for locality and the LogSparse attention are missing. If that paper is not actually implemented, it really shouldn't be listed in the documentation.

    opened by mvccn 4
  • Example auto-encoder time series

    We could use a detailed end-to-end example of using an autoencoder to create representations of temporal data. This should likely be done on Kaggle and then added to the flow tutorials repo as a link.
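
    A bare-bones sketch of the kind of model such a tutorial might cover is shown below: a small fully connected autoencoder that compresses fixed-length windows of a series into low-dimensional representations. It is a generic illustration, not code from the library or the planned tutorial.

        # Generic time series autoencoder sketch (illustration only): encode fixed-length
        # windows into a small latent vector that serves as a learned representation.
        import torch
        import torch.nn as nn


        class WindowAutoEncoder(nn.Module):
            def __init__(self, window: int = 24, latent: int = 4):
                super().__init__()
                self.encoder = nn.Sequential(nn.Linear(window, 32), nn.ReLU(), nn.Linear(32, latent))
                self.decoder = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(), nn.Linear(32, window))

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                return self.decoder(self.encoder(x))


        model = WindowAutoEncoder()
        x = torch.randn(16, 24)                     # 16 windows of 24 time steps each
        loss = nn.functional.mse_loss(model(x), x)  # reconstruction objective
        embeddings = model.encoder(x)               # (16, 4) learned representations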

    documentation 
    opened by isaacmg 0
  • Pyre type error fixed.

    "filename": "flood_forecast/preprocessing/process_usgs.py"
    "warning_type": "Invalid type [31]"
    "warning_message": "Expression (pandas.DataFrame, int, int, int) is not a valid type."
    "warning_line": 82
    "fix": remove int, int, int

    opened by luca-digrazia 0
  • Bump shap from 0.40.0 to 0.41.0

    Bumps shap from 0.40.0 to 0.41.0.

    Release notes

    Sourced from shap's releases.

    v0.41.0

    Lots of bugs fixes and API improvements.

    Commits
    • 510c4b6 Merge pull request #2242 from ravwojdyla/allow-to-control-the-heatmap-size
    • dd967b6 Merge branch 'master' of https://github.com/slundberg/shap
    • a791685 fix std to account for averaging
    • 6995c03 Merge branch 'master' into allow-to-control-the-heatmap-size
    • b6e90c8 Merge pull request #2580 from alexisdrakopoulos/feat/refactor_exceptions
    • 4921c50 Merge pull request #2162 from TheZL/xgbmodel_buffer_lstrip_error_correction
    • a8dbefd Clean up the intro doc notebook
    • 84ddd09 Merge branch 'feat/refactor_exceptions' of github.com:alexisdrakopoulos/shap ...
    • 348dc7d accidental import
    • 2cfa489 Merge branch 'master' into xgbmodel_buffer_lstrip_error_correction
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 0