Forecasting for knowable future events using Bayesian informative priors (forecasting with judgmental adjustment).

Overview

What is judgyprophet?

judgyprophet is a Bayesian forecasting algorithm based on Prophet that enables forecasting using information the business knows about future events. The aim is to let users perform forecasting with judgmental adjustment in a way that is as mathematically sound as possible.

Some events will have a big effect on your timeseries, and you will often be aware of them ahead of time. For example:

  • An existing product entering a new market.
  • A price change to a product.

These events will typically cause a large change in your timeseries of e.g. product sales, which a standard statistical forecast will consistently underestimate.

The business will often have good estimates (or at least better ones than your statistical forecast) of how these events will affect your timeseries, but this information is difficult to encode into a statistical forecasting algorithm. One option is to use a regressor, but this typically works poorly: you have no data on the event before it occurs, and the statistical forecast does not know how to balance the information in your regressor against the trend after the event occurs (which can lead to erratic behaviour).

judgyprophet solves this problem by encoding the business estimate of how the event will affect the forecast (the judgmental adjustment) as a Bayesian informative prior.

Before the event occurs, this business information is used to shape the forecast of what will happen post-event, e.g. the estimated uplift in product sales once the event has happened. After the event occurs, we update what the business expected with what we actually observe in the actuals. This is done using standard Bayesian updating.
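
To give a flavour of the idea, here is a toy sketch (not judgyprophet's actual model, and all numbers are hypothetical): treat the business estimate as a normal prior on the trend uplift, and let standard conjugate updating pull that estimate towards whatever the post-event actuals suggest.

import numpy as np

# Toy conjugate-normal update (illustration only -- not judgyprophet's model).
# The business estimate acts as an informative prior on the trend uplift.
m0, prior_sd = 6.0, 1.0                        # business estimate of the uplift, and its uncertainty
observed_uplifts = np.array([4.8, 5.1, 4.9])   # hypothetical per-period uplifts seen after the event
obs_sd = 0.5                                   # assumed observation noise

prior_prec = 1.0 / prior_sd ** 2
data_prec = len(observed_uplifts) / obs_sd ** 2
post_mean = (prior_prec * m0 + data_prec * observed_uplifts.mean()) / (prior_prec + data_prec)
print(post_mean)  # pulled from the prior estimate (6.0) towards the data (~4.9)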

Installation

1. Install the judgyprophet Python package using pip

pip install judgyprophet

2. Compile the STAN model

judgyprophet depends on STAN, whose models have to be compiled before running.

So to use judgyprophet, you first have to compile the model. Do this from the shell using

python -c "from judgyprophet import JudgyProphet; JudgyProphet().compile()"

or in Python using

from judgyprophet import JudgyProphet

JudgyProphet().compile()

This will take a while, but you only have to run it once, after the initial install.

Documentation

Full documentation is available on our GitHub Pages site here.

Scroll down for a quickstart tutorial.

A runnable Jupyter notebook version of the quickstart tutorial is available here.

Roadmap

Some things on our roadmap:

  • Currently the judgyprophet STAN file is only tested on Unix-based machines (Linux and Mac). We aim to fully test on Windows ASAP.
  • Option to run full MCMC, rather than just L-BFGS.
  • Prediction intervals
  • Regressors/holidays

Quickstart Tutorial

Imagine your business currently operates in the US, but is launching its product in Europe. As a result, it anticipates a sharp uptake in sales, which it has an estimate of. The business comes to you, its forecasting team, and asks you to account for this.

Let's look at how we might do this using judgyprophet with some example data, where we know what happened. First let's plot this:

from judgyprophet.tutorials.resources import get_trend_event

example_data = get_trend_event()
p = example_data.plot.line()

(Figure: line plot of the example product sales timeseries)

We can see that product sales increased sharply from about September 2020. Suppose this was a launch in a new market, and that the business had an initial estimate of the impact back in May 2020: it expected the slope to increase by 6.

Let's use judgyprophet to forecast this series from May 2020. We do this by encoding the initial business estimate as a trend event.

from judgyprophet import JudgyProphet
import pandas as pd
import seaborn as sns

# Create the expected trend events by consulting with the business
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6}
]


# Cutoff the data to May 2020
data_may2020 = example_data.loc[:"2020-05-01"]

# Make the forecast with the business estimated trend event
# We have no level events, so just provide the empty list.
jp = JudgyProphet()
# Because the event is beyond the actuals, judgyprophet throws a warning.
#    This is just because the Bayesian model at the event has no actuals to learn from.
#    The event is still used in predictions.
jp.fit(
    data=data_may2020,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
WARNING:judgyprophet.judgyprophet:Post-event data for trend event New market entry less than 0 points. Event deactivated in model. Event index: 2020-09-01, training data end index: 2019-06-01 00:00:00
WARNING:judgyprophet.utils:No active trend or level events (i.e. no event indexes overlap with data). The model will just fit a base trend to the data.


Initial log joint probability = -3.4521
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
       3      -2.92768      0.054987   8.11433e-14           1           1        7
Optimization terminated normally:
  Convergence detected: gradient norm is below tolerance

Because we are in May 2020, the forecasting algorithm has no post-event actuals to learn from, so it just uses the business estimate. Let's plot the result:

from judgyprophet.tutorials.resources import plot_forecast

plot_forecast(
    actuals=example_data,
    predictions=predictions,
    cutoff="2020-05-01",
    events=trend_events
)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -17.0121
Iteration  1. Log joint probability =    10.4753. Improved by 27.4875.
Iteration  2. Log joint probability =    12.7533. Improved by 2.27796.
Iteration  3. Log joint probability =    25.4696. Improved by 12.7163.
Iteration  4. Log joint probability =     26.707. Improved by 1.2374.
Iteration  5. Log joint probability =    26.7075. Improved by 0.000514342.
Iteration  6. Log joint probability =    26.7104. Improved by 0.00296558.
Iteration  7. Log joint probability =    26.7122. Improved by 0.00171322.
Iteration  8. Log joint probability =    26.7157. Improved by 0.00351772.
Iteration  9. Log joint probability =    26.7159. Improved by 0.000208268.
Iteration 10. Log joint probability =    26.7159. Improved by 6.64977e-05.
Iteration 11. Log joint probability =     26.716. Improved by 6.89899e-05.
Iteration 12. Log joint probability =     26.716. Improved by 3.06578e-05.
Iteration 13. Log joint probability =     26.716. Improved by 8.91492e-07.
Iteration 14. Log joint probability =     26.716. Improved by 8.71052e-09.

(Figure: forecast from the May 2020 cutoff, showing actuals, predictions and the trend event)

We can see judgyprophet is accounting for the increased trend, but the business slightly overestimated the increase in sales due to the product launch.

Let's fast forward to January 2021: the business wants to reforecast based on its estimate and what it has seen of the product launch so far. This is where judgyprophet comes into its own.

Once actuals are observed after the event has taken place, judgyprophet updates its estimate of what the event impact is. Let's look at this in action:

# Cutoff the data to January 2021
data_jan2021 = example_data.loc[:"2021-01-01"]

# Reforecast using the new actuals, now that we are at Jan 2021
jp = JudgyProphet()
jp.fit(
    data=data_jan2021,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
INFO:judgyprophet.judgyprophet:Adding trend event New market entry to model. Event index: 2020-09-01, training data start index: 2019-06-01 00:00:00, training data end index: 2021-01-01 00:00:00. Initial gradient: 6. Damping: None.


Initial log joint probability = -309.562
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
      10      -1.64341   2.10244e-05   3.61281e-06           1           1       15
Optimization terminated normally:
  Convergence detected: relative gradient magnitude is below tolerance

Now let's plot the results:

plot_forecast(actuals=example_data, predictions=predictions, cutoff="2021-01-01", events=trend_events)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -24.5881
Iteration  1. Log joint probability =   -1.06803. Improved by 23.5201.
Iteration  2. Log joint probability =    11.6215. Improved by 12.6895.
Iteration  3. Log joint probability =    36.5271. Improved by 24.9056.
Iteration  4. Log joint probability =    37.3776. Improved by 0.850488.
Iteration  5. Log joint probability =    37.6489. Improved by 0.271259.
Iteration  6. Log joint probability =    37.6547. Improved by 0.00580657.
Iteration  7. Log joint probability =    37.7831. Improved by 0.128419.
Iteration  8. Log joint probability =    37.7884. Improved by 0.00527858.
Iteration  9. Log joint probability =     37.789. Improved by 0.000612124.
Iteration 10. Log joint probability =    37.7891. Improved by 9.93823e-05.
Iteration 11. Log joint probability =    37.7902. Improved by 0.00112416.
Iteration 12. Log joint probability =    37.7902. Improved by 3.17397e-06.
Iteration 13. Log joint probability =    37.7902. Improved by 1.59404e-05.
Iteration 14. Log joint probability =    37.7902. Improved by 5.06854e-07.
Iteration 15. Log joint probability =    37.7902. Improved by 6.87792e-07.
Iteration 16. Log joint probability =    37.7902. Improved by 4.82761e-08.
Iteration 17. Log joint probability =    37.7902. Improved by 2.50385e-07.
Iteration 18. Log joint probability =    37.7902. Improved by 6.60322e-09.

(Figure: reforecast from the January 2021 cutoff, after Bayesian updating on the post-event actuals)

In this case, once judgyprophet observes the post-event data, the Bayesian updating realises the business estimate was a bit too high, so it reduces it.

This was a simple example to demonstrate judgyprophet. You can add many trend events within a single forecasting horizon, and add damping. You can also add level events (changes in the forecast level) and seasonality; see our other tutorials for details, and the sketch below for a rough flavour of what a multi-event setup might look like.
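
For example (this is a hedged sketch rather than a definitive recipe): the trend-event keys 'name', 'index', 'm0' and the damping parameter 'gamma' are part of the fit() API, but the level-event dictionary shown here, including its 'c0' key, is an assumption for illustration only, so check the level-event tutorial for the exact key names.

# Sketch only: several trend events (with optional damping via 'gamma') plus a level event.
# The level-event key 'c0' is an assumption -- consult the fit() docstring for the exact name.
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6, 'gamma': 0.9},
    {'name': "Second market entry", 'index': '2021-03-01', 'm0': 3},
]
level_events = [
    {'name': "Price change", 'index': '2021-06-01', 'c0': 10},  # hypothetical level shift
]

jp = JudgyProphet()
jp.fit(data=example_data, level_events=level_events, trend_events=trend_events, seed=13)
predictions = jp.predict(horizon=12)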
