DLWP: Deep Learning Weather Prediction

Overview

DLWP is a Python project containing data-processing and model-building tools for predicting the gridded atmosphere using deep convolutional neural networks.

Reference

If you use this code or find it useful, please cite our publication!

Getting started

For now, DLWP is not a package that can be installed with pip or a setup.py file, so it works like most research code: download (or check out) the repository and run.

Required dependencies

It is assumed that the following are installed using Anaconda Python 3 (Python 2.7 is supported).

  • TensorFlow (GPU capable version highly recommended). The conda package, while not the recommended installation method, is easy and also installs the required CUDA dependencies. For best performance, follow the instructions for installing from source.
    conda install tensorflow-gpu
  • Keras
    pip install keras
  • netCDF4
    conda install netCDF4
  • xarray
    conda install dask xarray

Optional dependencies

The following are required only for some of the DLWP features:

  • PyTorch: for torch-based deep learning models. Again, the GPU-ready version is recommended.
    pip install torch torchvision
  • scikit-learn: for machine learning pre-processing tools such as Scalers and Imputers
    conda install scikit-learn
  • scipy: for CFS data interpolation
  • pygrib: for raw CFS data processing
    pip install pygrib
  • cdsapi: for retrieval of ERA5 data
    pip install cdsapi
  • pyspharm: spherical harmonics transforms for the barotropic model
    conda install -c conda-forge pyspharm

Quick overview

General framework

DLWP is built as a weather forecasting model that can, should performance improve greatly, "replace" an existing global weather or climate model. Essentially, this means that DLWP uses a deep convolutional neural network to map the state of the atmosphere at one time to the entire state of the atmosphere at the next available time. A continuous forecast can then be made by feeding the model's predicted state back in as input, producing forecasts of arbitrary length.
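
Conceptually, the iterative forecast is a simple feedback loop in which each predicted state becomes the next input. The sketch below illustrates the idea with a generic predict call and NumPy arrays; the function and variable names are illustrative, not the actual DLWP API.

    import numpy as np

    def iterate_forecast(model, initial_state, n_steps):
        """Feed each predicted state back in as the next input (illustrative sketch)."""
        state = initial_state
        states = []
        for _ in range(n_steps):
            state = model.predict(state)  # map the atmospheric state at t to the state at t + dt
            states.append(state)
        return np.stack(states)           # (n_steps, ...) array of forecast states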

Data processing

The classes in DLWP.data provide tools for retrieving and processing raw data from the CFS reanalysis and reforecast and the ERA5 reanalysis. Meanwhile, the DLWP.model.preprocessing module provides tools for formatting the data for ingestion into the deep learning models. The following examples retrieve and process data from the CFS reanalysis:

  • examples/write_cfs.py
  • examples/write_cfs_predictors.py

The resulting file of predictor data can be ingested into the data generators for the models.
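
For instance, the predictor file is an ordinary netCDF file and can be inspected with xarray before being handed to a data generator; the file path and variable name below are assumptions for illustration.

    import xarray as xr

    # Open the predictor file written by examples/write_cfs_predictors.py
    # (the path is an assumed placeholder).
    ds = xr.open_dataset('cfs_predictors.nc')
    print(ds)                      # dimensions, coordinates, and variables
    print(ds['predictors'].shape)  # e.g. (time, variable/level, lat, lon)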

Keras models

The DLWP.model module contains classes for building and training Keras and PyTorch models. The DLWPNeuralNet class is essentially a wrapper for the simple Keras Sequential model, adding optional run-time scaling and imputing of data. It implements a few key methods:

  • build_model: use a custom API to assemble layers in a Sequential model. Also implements models running on multiple GPUs.
  • fit: scale the data and fit the model
  • fit_generator: use the Keras fit_generator method along with a custom data generator (see section below)
  • predict: predict with the model
  • predict_timeseries: predict a continuous time series forecast, where the output of one prediction iteration is used as the input for the next

For an example of a model built and trained with the DLWP APIs using data generated by the DLWP processing methods, see examples/train.py.
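
A rough outline of that workflow is sketched below. The constructor keywords and the layer-specification format are simplified assumptions for illustration; examples/train.py shows the authoritative usage.

    from DLWP.model import DLWPNeuralNet

    # Build and train a Sequential-style DLWP model (illustrative sketch only;
    # the keyword arguments and layer specification are assumptions).
    dlwp = DLWPNeuralNet(scaler_type='StandardScaler', impute_missing=True)

    layers = [
        ('ConvLSTM2D', {'filters': 16, 'kernel_size': 3, 'padding': 'same', 'return_sequences': True}),
        ('Conv2D', {'filters': 4, 'kernel_size': 3, 'padding': 'same'}),
    ]
    dlwp.build_model(layers, loss='mse', optimizer='adam')

    # predictors/targets: arrays from the preprocessed predictor file
    dlwp.fit(predictors, targets, batch_size=64, epochs=10)  # scales/imputes the data, then trains
    forecast = dlwp.predict_timeseries(predictors[:1], 24)   # iterate the model 24 steps forward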

DLWP also implements a DLWPFunctional class which implements the same methods as the DLWPNeuralNet class but takes as input to build_model a model assembled using the Keras functional API. For an example of training a functional model, see examples/train_functional.py.

PyTorch models

Currently, due to a focus on TensorFlow/Keras models, the PyTorch implementation in DLWP is more limited, although still robust. Like the Keras models, it implements a convenient build_model method to assemble a sequential-like model using the same API parameters as those for DLWPNeuralNet. It also implements a fit method to automatically iterate through the data and optimizer, just like the Keras API.

The PyTorch example, train_torch.py, is somewhat outdated and uses the spherical convolution library s2cnn. This method has yet to produce good results.

Custom layers and functions

The DLWP.custom module contains many custom layers specifically for applying convolutional neural networks to the global weather prediction problem. For example, PeriodicPadding2D implements periodic boundary conditions for padding data in space prior to applying convolutions. These custom layers are worth a look.
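
As a hedged example, a layer such as PeriodicPadding2D would typically sit just ahead of a convolution so that the longitude dimension wraps around rather than being zero-padded; the constructor argument below mirrors Keras's ZeroPadding2D and is an assumption.

    from keras.layers import Input, Conv2D
    from keras.models import Model
    from DLWP.custom import PeriodicPadding2D  # custom DLWP layer

    inputs = Input(shape=(73, 144, 6))             # (lat, lon, channels)
    x = PeriodicPadding2D(padding=(0, 1))(inputs)  # wrap longitude by one point on each side (argument assumed)
    x = Conv2D(16, 3)(x)                           # convolution now sees a periodic domain in longitude
    model = Model(inputs, x)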

Data generators

DLWP.model.generators contains several classes for generating data on-the-fly from a netCDF file produced by the DLWP preprocessing methods. These data generators can then be used in conjunction with a DLWP model instance's fit_generator method.

  • The DataGenerator class is the simplest generator class. It merely returns batches of data from a file containing "predictors" and "targets" variables already formatted for use in the DLWP model. Due to this simplicity, this is the optimal way to generate data directly from the disk when system memory is not sufficient to load the entire dataset. However, this comes at the cost of generating very large files on disk with redundant data (since the targets are merely a different time shift of the predictors).
  • The SeriesDataGenerator class is much more robust and memory-efficient. It expects only a single "predictors" variable in the input file and generates predictor-target pairs on the fly for each batch of data. It also has the ability to prescribe external fields such as incoming solar radiation. A usage sketch follows this list.
  • The SmartDataGenerator is deprecated in favor of SeriesDataGenerator.
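
A hedged sketch of pairing SeriesDataGenerator with a model's fit_generator method follows; the constructor keyword arguments here are assumptions for illustration only.

    import xarray as xr
    from DLWP.model.generators import SeriesDataGenerator

    data = xr.open_dataset('cfs_predictors.nc')  # predictor file from the preprocessing step (path assumed)

    # Generate predictor-target pairs on the fly from the single 'predictors' variable
    # (keyword arguments are illustrative assumptions).
    generator = SeriesDataGenerator(dlwp, data, input_time_steps=2, output_time_steps=2, batch_size=64)

    dlwp.fit_generator(generator, epochs=10)  # dlwp: a DLWPNeuralNet as in the earlier sketch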

Advanced forecast tools

The DLWP.model module also contains a TimeSeriesEstimator class. This class can be used to make robust forward forecasts where the data input does not necessarily match the data output of a model. An example usage of this class is in examples/validate.py, which performs basic routines to validate the forecast skill of DLWP models.
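
A minimal, hypothetical usage is sketched below; the constructor and method arguments are assumptions, and examples/validate.py shows the real workflow.

    from DLWP.model import TimeSeriesEstimator

    # Wrap a trained model and a validation data generator to step forecasts forward
    # (arguments are illustrative assumptions).
    estimator = TimeSeriesEstimator(dlwp, validation_generator)
    forecast = estimator.predict(24)  # a 24-step forward forecast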

Other

The DLWP.util module contains useful utilities, including save_model and load_model for saving and loading DLWP models (and correctly dealing with multi-GPU models).
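
For example, the save and load calls pair up roughly as follows; the file path and exact arguments are assumptions.

    from DLWP.util import save_model, load_model

    save_model(dlwp, 'dlwp_cfs_model')   # persist the trained DLWP wrapper and its weights (path assumed)
    dlwp = load_model('dlwp_cfs_model')  # restore it later, including multi-GPU models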
