JumpDiff: Non-parametric estimators for jump-diffusion processes in Python

Overview


jumpdiff

jumpdiff is a Python library with non-parametric Nadaraya–Watson estimators to extract the parameters of jump-diffusion processes. With jumpdiff one can extract the parameters of a jump-diffusion process from one-dimensional timeseries, employing a kernel-density estimation method combined with a set of second-order corrections for a precise retrieval of the parameters from short timeseries.

Installation

To install jumpdiff, run

   pip install jumpdiff

Then, in your favourite editor, simply use

   import jumpdiff as jd

Dependencies

The library's parameter estimation depends solely on numpy and scipy. The mathematical formulae depend on sympy. It stems from the kramersmoyal project (Ref. 3), but functions independently from it.

Documentation

You can find the documentation here.

Jump-diffusion processes

The theory

Jump-diffusion processes (Ref. 1), as the name suggests, are a mixed type of stochastic process with a diffusive term and a jump term. One form of these processes which is mathematically tractable is given by the stochastic differential equation

dX(t) = a(X(t)) dt + b(X(t)) dW(t) + ξ dJ(t),

which has four main elements: a drift term a(x), a diffusion term b(x), a jump amplitude term ξ, which is drawn from a Gaussian distribution, and finally a jump rate λ. Here W(t) is a Wiener process and J(t) a Poisson jump process with rate λ. You can find a good review on this topic in Ref. 2.
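
To make these elements concrete, below is a minimal Euler-type integration sketch of such a process, assuming ξ plays the role of the variance of the Gaussian jump sizes. It is illustrative only: jumpdiff's own jd_process, used in the next section, is the recommended integrator.

import numpy as np

def jump_diffusion_euler(t_final, delta_t, a, b, xi, lamb, x0=0.0):
    # number of integration steps
    n = int(t_final / delta_t)
    X = np.zeros(n)
    X[0] = x0
    for i in range(1, n):
        # diffusive part: Wiener increment with variance delta_t
        dW = np.random.normal(0.0, np.sqrt(delta_t))
        # jump part: Poisson number of jumps in this step, each with a
        # Gaussian amplitude (xi taken as the jump-size variance: an
        # assumption of this sketch)
        n_jumps = np.random.poisson(lamb * delta_t)
        jump = np.sum(np.random.normal(0.0, np.sqrt(xi), n_jumps))
        X[i] = X[i-1] + a(X[i-1]) * delta_t + b(X[i-1]) * dW + jump
    return X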

Integrating a jump-diffusion process

Let us use the functions in jumpdiff to generate a jump-diffusion process, and subsequently retrieve its parameters. This is a good way to understand the usage of the integrator and the non-parametric retrieval of the parameters.

First we need to load our library. We will call it jd

import jumpdiff as jd

Let us thus define a jump-diffusion process and use jd_process to integrate it. Note that the drift and diffusion terms must be given as functions.

# integration time and time sampling
t_final = 10000
delta_t = 0.001

# A drift function
def a(x):
    return -0.5*x

# and a (constant) diffusion term
def b(x):
    return 0.75

# Now define a jump amplitude and rate
xi = 2.5
lamb = 1.75

# and simply call the integration function
X = jd.jd_process(t_final, delta_t, a=a, b=b, xi=xi, lamb=lamb)

This will generate a jump-diffusion process X of length int(10000/0.001) = 10000000 with the given parameters.
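
As a quick sanity check we can plot the sample path (using matplotlib, an assumption of convenience here):

import numpy as np
import matplotlib.pyplot as plt

time = np.linspace(0, t_final, X.size)
plt.plot(time, X, lw=0.2)
plt.xlabel('t')
plt.ylabel('X(t)')
plt.show()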

Using jumpdiff to retrieve the parameters

Moments and Kramers–Moyal coefficients

Take the timeseries X and use the function moments to retrieve the conditional moments of the process. For now let us focus on the shortest time lag, so we can best approximate the Kramers–Moyal coefficients. For this case we can simply employ

edges, moments = jd.moments(timeseries = X)

In the array edges are the limits of our space, and the array moments records all 6 powers/orders of our conditional moments. Let us take a look at these before we proceed, to get acquainted with them.
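
A quick way to get oriented is to inspect the array shapes; the first axis of moments runs over the moment orders:

print(edges.shape)    # the positions in x where the moments are evaluated
print(moments.shape)  # the first axis indexes the moment order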

We can plot the first moment with any conventional plotter, so let us use pyplot from matplotlib

import matplotlib.pyplot as plt

# we want the first power, so we need 'moments[1,...]'
plt.plot(edges, moments[1,...])

The first moment here (i.e., the first Kramers–Moyal coefficient) is given solely by the drift term that we have selected, -0.5*x.
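
One can overlay the chosen drift for comparison. Depending on the normalisation convention of moments, a factor of delta_t may enter, so take this as a sketch:

import matplotlib.pyplot as plt

plt.plot(edges, moments[1,...], label='estimated first moment')
plt.plot(edges, -0.5*edges, '--', label='drift a(x) = -0.5*x')
plt.legend()
plt.show()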

And the second moment (i.e., the second Kramers–Moyal coefficient) is a mixture of the contributions of the diffusive term b(x) and of the jump terms ξ and λ.

You have this stored in moments[2,...].

Retrieving the jump-related terms

Naturally one of the most pertinent questions when addressing jump-diffusion processes is the possibility of recovering these same parameters from data. For the given jump-diffusion process we can use the jump_amplitude and jump_rate functions to non-parametrically estimate the jump amplitude and jump rate terms.

With the moments in hand, all we need is

# first estimate the jump amplitude
xi_est = jd.jump_amplitude(moments = moments)

# and now estimate the jump rate
lamb_est = jd.jump_rate(moments = moments)

which in our case results in xi_est: ξ = 2.43 ± 0.17 and lamb_est: λ = 1.744 * delta_t (don't forget to divide lamb_est by delta_t)!
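
In code, the rescaling reads:

# lamb_est carries a factor of the time step; rescale to physical units
print(xi_est)              # ξ estimate: 2.43 ± 0.17 (true value 2.5)
print(lamb_est / delta_t)  # λ estimate: ≈ 1.744 (true value 1.75)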

Other functions and options

Included in this package is also the Milstein scheme of integration, particularly important when the diffusion term has some spatial dependence on x. moments can also calculate the conditional moments for different time lags, using the parameter lag, as sketched below.
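
For example, requesting several lags at once could look like the sketch below; the list form of lag and the solver keyword for selecting the Milstein scheme are assumptions here, so check the documentation:

# conditional moments at several time lags (in multiples of delta_t)
edges, moments = jd.moments(timeseries=X, lag=[1, 10, 100])

# hypothetical run with the Milstein scheme for a state-dependent
# diffusion term (keyword name and accepted value assumed)
def b_linear(x):
    return 0.1 + 0.5*x

X_m = jd.jd_process(t_final, delta_t, a=a, b=b_linear, xi=xi, lamb=lamb,
                    solver='Milstein')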

In formulae, the set of formulas needed to calculate the second-order corrections is given (in sympy).
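
A sketch of retrieving such an expression (the function name m_formula follows the documentation's naming, but treat it as an assumption):

from jumpdiff.formulae import m_formula

# display the sympy expression for the 4th conditional moment,
# including the second-order corrections
print(m_formula(power=4))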

Contributions

We welcome reviews and ideas from everyone. If you want to share your ideas or upgrades, have doubts, or simply want to report a bug, open an issue here on GitHub, or contact us directly. If you need help with the code, the theory, or the implementation, drop us an email. We abide by a Conduct of Fairness.

Changelog

  • Version 0.4 - A set of self-consistency checks, the documentation, examples, and trial code. Code at PyPI.
  • Version 0.3 - A straightforward procedure to retrieve the jump amplitude and jump rate functions, alongside an easy sympy display of the corrections.
  • Version 0.2 - Introducing the second-order corrections to the moments.
  • Version 0.1 - Design and implementation of the moments function, generalising kramersmoyal's km.

Literature and Support

History

This project was started in 2017 at the neurophysik group by Leonardo Rydin Gorjão, Jan Heysel, Klaus Lehnertz, and M. Reza Rahimi Tabar, and separately by Pedro G. Lind, at the Department of Computer Science, Oslo Metropolitan University. From 2019 to 2021, Pedro G. Lind, Leonardo Rydin Gorjão, and Dirk Witthaut developed a set of corrections and an implementation for Python, presented here.

Funding

This work was supported by the Helmholtz Association Initiative Energy System 2050 - A Contribution of the Research Field Energy, grant No. VH-NG-1025, and by the STORM - Stochastics for Time-Space Risk Models project of the Research Council of Norway (RCN), grant No. 274410.


Bibliography

1 Tabar, M. R. R. Analysis and Data-Based Reconstruction of Complex Nonlinear Dynamical Systems. Springer International Publishing (2019), Chapter: Stochastic Processes with Jumps and Non-vanishing Higher-Order Kramers–Moyal Coefficients.

2 Friedrich, R., Peinke, J., Sahimi, M., Tabar, M. R. R. Approaching complexity by stochastic methods: From biological systems to turbulence, Physics Reports 506, 87–162 (2011).

3 Rydin Gorjão, L., Meirinhos, F. kramersmoyal: Kramers–Moyal coefficients for stochastic processes. Journal of Open Source Software, 4(44) (2019).

Extended Literature

You can find further reading on SDEs, non-parametric estimations, and the general principles of the Fokker–Planck equation, Kramers–Moyal expansion, and related topics in the classic (physics) books

  • Risken, H. The Fokker–Planck equation. Springer, Berlin, Heidelberg (1989).
  • Gardiner, C.W. Handbook of Stochastic Methods. Springer, Berlin (1985).

And an extensive review on the subject here.
