Deep learning PyTorch library for time series forecasting, classification, and anomaly detection

Overview

Deep learning for time series forecasting

Flow Forecast is an open-source deep learning framework for time series forecasting. It provides the latest state-of-the-art models (transformers, attention models, GRUs) and cutting-edge concepts, along with easy-to-understand interpretability metrics, cloud provider integration, and model serving capabilities. Flow Forecast was the first time series framework to support transformer-based models and remains the only true end-to-end deep learning framework for time series forecasting. The repository is currently maintained primarily by Task-TS from CoronaWhy. Pull requests are welcome. Historically, this repository provided open-source benchmarks and code for flash flood and river flow forecasting.

For additional tutorials (on Colab) and examples please see our tutorials repository.

branch | status
master | CircleCI
Build PY | Upload Python Package
Documentation | Documentation Status
CodeCov | codecov
CodeFactor | CodeFactor

Getting Started

Using the library

  1. Run pip install flood-forecast
  2. Detailed info on training models can be found on the Wiki; a minimal training sketch follows this list.
  3. Check out our Confluence Documentation
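
A minimal training sketch, assuming the package is installed and a JSON config file is available. The call pattern below (train_function from flood_forecast.trainer taking a model type string and the full config dict) is inferred from the traceback quoted later on this page; treat the exact signature and config schema as assumptions and consult the Wiki for the authoritative details.

```python
# Minimal sketch: launch training from a JSON config file.
# train_function and its arguments are inferred from the traceback quoted in the
# Comments section below; the config schema itself is documented on the Wiki.
import json

from flood_forecast.trainer import train_function

with open("example_config.json") as f:   # hypothetical config path
    training_config = json.load(f)

train_function(training_config["model_type"], training_config)
```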

Models currently supported

  1. Vanilla LSTM (LSTM): A basic LSTM that is suitable for multivariate time series forecasting and transfer learning.
  2. Full transformer (SimpleTransformer in model_dict): The full original transformer with all 8 encoder and decoder blocks. Requires passing the target in at inference.
  3. Simple Multi-Head Attention (MultiHeadSimple): A simple multi-head attention block and linear embedding layers. Suitable for transfer learning.
  4. Transformer with a linear decoder (CustomTransformerDecoder in model_dict): A transformer with n-encoder blocks (this is tunable) and a linear decoder.
  5. DA-RNN (DARNN): A well-rounded model which utilizes an LSTM + attention.
  6. Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (called DecoderTransformer in model_dict):
  7. Transformer XL:
  8. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting (Informer)
  9. DeepAR
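
To connect this list to the configuration files, the names in parentheses (e.g. LSTM, DARNN, SimpleTransformer) are the keys used to select a model from model_dict. The snippet below is only a hypothetical, partial config sketch: model_name is an assumed field, model_type appears in the trainer traceback later on this page, and the dataset_params keys are copied from the inference_params example in the Comments section; see the Wiki/Confluence documentation for the real schema.

```python
# Hypothetical partial training config; field names are borrowed from examples
# elsewhere on this page and are assumptions, not the authoritative schema.
training_config = {
    "model_name": "DARNN",     # assumed: any key from model_dict listed above
    "model_type": "PyTorch",   # as passed to train_function in the traceback below
    "dataset_params": {
        "file_path": "data/river_flow.csv",   # hypothetical local CSV
        "forecast_history": 8,
        "forecast_length": 1,
        "relevant_cols": ["cfs1", "precip", "temp", "month"],
        "target_col": ["cfs1"],
        "sort_column": "hour_updated",
    },
}
```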

Forthcoming Models

We have a number of models we are planning to release soon. Please check our project board for more info.

Integrations

Google Cloud Platform

Weights and Biases
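
A generic Weights & Biases logging sketch, shown only to illustrate the kind of metrics the integration tracks; Flow Forecast's own W&B support is wired up through the training configuration rather than called manually like this, and the project and metric names below are made up.

```python
# Generic wandb usage for illustration; not Flow Forecast's built-in integration.
import wandb

run = wandb.init(project="flow-forecast-demo", config={"forecast_length": 1})
for epoch in range(3):
    # In a real run these would be per-epoch training/validation losses.
    run.log({"epoch": epoch, "train_loss": 1.0 / (epoch + 1)})
run.finish()
```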

Contributing

For instructions on contributing please see our contributions page and our project board.

Historical River Flow Data

Task 1 Stream Flow Forecasting

This task focuses on forecasting a stream's future flow/height (in cfs or feet, respectively) given factors such as current flow, temperature, and precipitation. In the future we plan to add more variables that help with stream flow prediction, such as snowpack data and the surrounding soil moisture index.

Task 2 Flood severity forecasting

Task two focuses on predicting the severity of the flood based on the flood forecast, population information, and topography. Flood severity is defined based on several factors, including the number of injuries, property damage, and crop damage.

If you use either the data or code from this repository please use the citation below. Additionally please cite the original authors of the models.

@misc{godfried2020flowdb,
      title={FlowDB a large scale precipitation, river, and flash flood dataset}, 
      author={Isaac Godfried and Kriti Mahajan and Maggie Wang and Kevin Li and Pranjalya Tiwari},
      year={2020},
      eprint={2012.11154},
      archivePrefix={arXiv},
      primaryClass={cs.AI}
}
Comments
  • Informer compatibility with interpretability methods

    Informer compatibility with interpretability methods

    Currently the Informer does not work with the SHAP interpretability methods. Making SHAP work with the Informer will likely require some significant refactoring, since with the Informer we have the target being passed at inference. We should also likely design a helper function to better support this (a rough sketch follows), built around calls such as:

    history, _, forecast_start_idx = csv_test_loader.get_from_start_date(datetime_start)
    background_tensor = _prepare_background_tensor(csv_test_loader)
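
    A rough sketch of the helper function this comment proposes, wrapping only the two calls quoted above; both callables and their signatures are assumptions taken from the comment rather than confirmed against the codebase.

    ```python
    # Hypothetical helper built from the calls quoted in the comment above;
    # get_from_start_date and _prepare_background_tensor are assumed to be in
    # scope with these signatures. Not code from the repository.
    def build_shap_inputs(csv_test_loader, datetime_start):
        """Gather the history window and SHAP background tensor for one start date."""
        history, _, forecast_start_idx = csv_test_loader.get_from_start_date(datetime_start)
        background_tensor = _prepare_background_tensor(csv_test_loader)
        return history, forecast_start_idx, background_tensor
    ```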

    enhancement 
    opened by isaacmg 7
  • Inference mode for time series models

    Inference mode for time series models

    Create a predict function which does inference for time series models without requiring the target to be present. This module should initialize the model using the given configuration file (with a weight path). It should be able to consume a CSV file or query a SQL table #102 (though this functionality is not required in the initial PR). It should ideally make use of the existing evaluator.py module but

    Acceptance Criteria

    • [ ] Passing tests
    deployment 
    opened by isaacmg 7
  • about dataset

    about dataset

    So how can I download the FlowDB dataset? Gsutil is not working. Can you give some details about your dataset, and tell us how to use your model on FlowDB? Thanks!

    opened by Vipermdl 6
  • Is the datetime_start parameter in inference_params the forecast start date?

    Is the datetime_start parameter in inference_params the forecast start date?

    In your Infer.ipynb, is the datetime_start parameter the forecast start date? (Your predict_cfs bucket has expired.)

    'inference_params': {
        'dataset_params': {
            'file_path': 'gs://predict_cfs/day_addition/01064118KPWM_flow.csv',
            'forecast_history': 8,
            'forecast_length': 1,
            'interpolate_param': {'method': 'back_forward', 'params': {}},
            'relevant_cols': ['cfs1', 'precip', 'temp', 'month'],
            'scaling': RobustScaler(),
            'sort_column': 'hour_updated',
            'target_col': ['cfs1']
        },
        'datetime_start': '2018-05-31',
        'decoder_params': {'decoder_function': 'simple_decode', 'unsqueeze_dim': 1},
        'hours_to_forecast': 336,
        'num_prediction_samples': 30,
        'test_csv_path': 'gs://predict_cfs/day_addition/01064118KPWM_flow.csv'
    }

    opened by JJNET 5
  • Poor informer performance

    Poor informer performance

    The performance of the Informer model still seems to be poor, at least with respect to forecasting the Virgin River flow. There may still be bugs, so we should investigate it on other datasets and add additional unit tests. Possibly we should also try to replicate the performance on the ETH datasets the model was trained on (related to #314). The model does not seem to learn anything from the temporal data input.

    opened by isaacmg 5
  • Adding GPU support to the Informer

    Adding GPU support to the Informer

    This PR aims to resolve the prior issues raised in #343 as well as fix a new problem related to the label_len in the data-loader. In addition, this PR includes documentation updates for the Informer and additional information on how to use the relevant data-loaders and SHAP features.

    opened by isaacmg 5
  • DecoderTransformer: Distinguishing Known inputs from Observed inputs

    DecoderTransformer: Distinguishing Known inputs from Observed inputs

    Hello Isaac. First of all thank you for this brilliant project. I was able to run the Decoder Transformer on the EU Wind Energy dataset.

    One question, though. The model's paper, when defining the problem, says that some exogenous time series are known up to the forecast horizon. For example, I would like to add the wind forecast as a feature with a middle dimension equal to "forecast_length" and the same time idx as the target. Is there a way to model this in your config_file or at a lower level within the Loader objects?

    Thank you

    Lorenzo Ostano

    opened by Vergangenheit 5
  • TypeError: Object of type Tensor is not JSON serializable when running train_transformer_style with takes_target as 1

    TypeError: Object of type Tensor is not JSON serializable when running train_transformer_style with takes_target as 1

    Traceback (most recent call last):
      File "flood_forecast/trainer.py", line 108, in <module>
        main()
      File "flood_forecast/trainer.py", line 103, in main
        train_function(training_config["model_type"], training_config)
      File "flood_forecast/trainer.py", line 42, in train_function
        train_transformer_style(model=trained_model,
      File "/home/harsh/Documents/Coronawhy/flow-forecast/flood_forecast/pytorch_training.py", line 146, in train_transformer_style
        model.save_model(model_filepath, max_epochs)
      File "/home/harsh/Documents/Coronawhy/flow-forecast/flood_forecast/time_model.py", line 152, in save_model
        json.dump(self.params, p)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/__init__.py", line 179, in dump
        for chunk in iterable:
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 431, in _iterencode
        yield from _iterencode_dict(o, _current_indent_level)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
        yield from chunks
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
        yield from chunks
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 438, in _iterencode
        o = _default(o)
      File "/home/harsh/anaconda3/envs/flow-forecast/lib/python3.8/json/encoder.py", line 179, in default
        raise TypeError(f'Object of type {o.__class__.__name__} '
    TypeError: Object of type Tensor is not JSON serializable
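
    For context on the error itself (not the project's actual fix), json.dump fails on torch.Tensor values unless they are converted first; a generic workaround is to supply a default= converter.

    ```python
    # Generic illustration of the failure above and one workaround; this is not
    # the repository's fix for the issue.
    import json

    import torch


    def to_serializable(obj):
        """Convert torch tensors to plain Python values for JSON encoding."""
        if isinstance(obj, torch.Tensor):
            return obj.tolist()   # 0-dim tensors become plain Python scalars
        raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")


    params = {"lr": 0.01, "last_loss": torch.tensor(0.42)}
    print(json.dumps(params, default=to_serializable))   # the tensor is emitted as a float
    ```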

    opened by 97harsh 5
  • Add meta-data fusion method and documentation

    Add meta-data fusion method and documentation

    Based on #100 we want to fuse meta-data with temporal data to enable better time series forecasts.

    • [x] Create a design document of meta-data fusion methods and explain relevant approaches
    • [x] Review design document with @kritim13 and other teammates.
    • [x] Implement agreed upon approach
    • [x] Create a JSON config file and appropriate unit tests.
    • [x] Test end to end in the Kaggle Notebook.
    meta-data 
    opened by isaacmg 5
  • Get ASOS data on GCS for years 2014-2019

    Get ASOS data on GCS for years 2014-2019

    Get all the data on GCS for those dates.

    • [x] Create looping function to perform action
    • [x] Create list of ASOS stations already saved with path on GCS. Upload this file to GCS.
    • [x] Run and get all data on GCS for all gages
    opened by isaacmg 5
  • DecoderTransformer not implemented as paper at all

    DecoderTransformer not implemented as paper at all

    Did I miss something? The DecoderTransformer, which the documentation claims implements the paper (Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting), is not even close to what the paper proposed. Key components like the conv1d layers for locality and the LogSparse attention are missing. If we didn't implement that paper, then we really shouldn't list it in the documentation.
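
    For readers unfamiliar with the paper, a minimal sketch of the locality idea it proposes: causal 1-D convolutions produce the attention queries/keys instead of pointwise linear projections. This illustrates the concept only and is not code from this repository.

    ```python
    # Sketch of the "convolutional self-attention" locality component from the
    # LogSparse Transformer paper; illustrative only, not repository code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CausalConvProjection(nn.Module):
        """Project (batch, seq_len, d_model) with a causal 1-D convolution."""

        def __init__(self, d_model: int, kernel_size: int = 3):
            super().__init__()
            self.left_pad = kernel_size - 1          # pad only the past side
            self.conv = nn.Conv1d(d_model, d_model, kernel_size)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = x.transpose(1, 2)                    # (batch, d_model, seq_len)
            x = F.pad(x, (self.left_pad, 0))         # causal: no future leakage
            return self.conv(x).transpose(1, 2)      # back to (batch, seq_len, d_model)

    queries = CausalConvProjection(d_model=64)(torch.randn(2, 96, 64))  # (2, 96, 64)
    ```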

    opened by mvccn 4
  • Example auto-encoder time series

    Example auto-encoder time series

    We could use a detailed end-to-end example of using an AutoEncoder to create representations of temporal data. This should likely be done on Kaggle and then added to the flow tutorials repo as a link.
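
    A generic sketch of the kind of example the issue asks for: a small autoencoder that compresses fixed-length windows of a series into low-dimensional representations. The window length, layer sizes, and data here are all made up for illustration.

    ```python
    # Toy time series autoencoder; illustrative only, not library code.
    import torch
    import torch.nn as nn

    window, latent = 24, 4
    model = nn.Sequential(
        nn.Linear(window, 16), nn.ReLU(), nn.Linear(16, latent),   # encoder
        nn.Linear(latent, 16), nn.ReLU(), nn.Linear(16, window),   # decoder
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    windows = torch.randn(128, window)               # stand-in for real series windows

    for _ in range(200):                             # reconstruction training loop
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(windows), windows)
        loss.backward()
        optimizer.step()

    representations = model[:3](windows)             # (128, latent) encodings
    ```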

    documentation 
    opened by isaacmg 0
  • Pyre type error fixed.

    Pyre type error fixed.

    "filename": "flood_forecast/preprocessing/process_usgs.py" "warning_type": "Invalid type [31]" "warning_message": " Expression (pandas.DataFrame, int, int, int) is not a valid type." "warning_line": 82 "fix": remove int,int,int

    opened by luca-digrazia 0
  • Bump shap from 0.40.0 to 0.41.0

    Bump shap from 0.40.0 to 0.41.0

    Bumps shap from 0.40.0 to 0.41.0.

    Release notes

    Sourced from shap's releases.

    v0.41.0

    Lots of bugs fixes and API improvements.

    Commits
    • 510c4b6 Merge pull request #2242 from ravwojdyla/allow-to-control-the-heatmap-size
    • dd967b6 Merge branch 'master' of https://github.com/slundberg/shap
    • a791685 fix std to account for averaging
    • 6995c03 Merge branch 'master' into allow-to-control-the-heatmap-size
    • b6e90c8 Merge pull request #2580 from alexisdrakopoulos/feat/refactor_exceptions
    • 4921c50 Merge pull request #2162 from TheZL/xgbmodel_buffer_lstrip_error_correction
    • a8dbefd Clean up the intro doc notebook
    • 84ddd09 Merge branch 'feat/refactor_exceptions' of github.com:alexisdrakopoulos/shap ...
    • 348dc7d accidental import
    • 2cfa489 Merge branch 'master' into xgbmodel_buffer_lstrip_error_correction
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies 
    opened by dependabot[bot] 0