More than a hundred strange attractors

dysts

Analyze more than a hundred chaotic systems.

An embedding of all chaotic systems in the collection

Basic Usage

Import a model and run a simulation with default initial conditions and parameter values

from dysts.flows import Lorenz

model = Lorenz()
sol = model.make_trajectory(1000)
# plt.plot(sol[:, 0], sol[:, 1])
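
The trajectory is returned as an array with one row per timepoint; a quick sanity check for the three-dimensional Lorenz system above (assuming the usual NumPy array return):

print(sol.shape)  # (1000, 3): 1000 timepoints across three state variables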

Modify a model's parameter values and re-integrate

model = Lorenz()
model.beta = 1              # Lorenz's parameters are sigma, rho, and beta
model.ic = [0.1, 0.0, 5.0]  # an initial condition off the invariant z-axis (x = y = 0)
sol = model.make_trajectory(1000)
# plt.plot(sol[:, 0], sol[:, 1])
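
A model's adjustable attributes can also be inspected before re-integrating; a short sketch using the ic and dt fields referenced elsewhere in this README:

model = Lorenz()
print(model.ic)  # default initial condition
print(model.dt)  # default integration timestep (see Implementation Notes)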

Load a precomputed trajectory for the model

eq = Lorenz()
sol = eq.load_trajectory(subsets="test", noise=False, granularity="fine")
# plt.plot(sol[:, 0], sol[:, 1])
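
Other variants of the precomputed data can be requested the same way; a hypothetical call mirroring the options above, in which the "coarse" granularity value is an assumption rather than something this README confirms:

sol_train = eq.load_trajectory(subsets="train", noise=True, granularity="coarse")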

Integrate new trajectories from all 131 chaotic systems with a custom granularity

from dysts.base import make_trajectory_ensemble

all_out = make_trajectory_ensemble(100, resample=True, pts_per_period=75)
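
The ensemble is returned as a dictionary keyed by system name; a minimal sketch of pulling out and plotting one system's trajectory, assuming matplotlib is installed:

import matplotlib.pyplot as plt

traj = all_out["Lorenz"]  # one trajectory array per system, keyed by name
plt.plot(traj[:, 0], traj[:, 1])
plt.show()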

Load a precomputed collection of time series from all 131 chaotic systems

from dysts.datasets import load_dataset

data = load_dataset(subsets="train", data_format="numpy", standardize=True)
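
A quick check of what came back; this sketch assumes only that the "numpy" data format returns an array-like object:

print(type(data))
print(getattr(data, "shape", None))  # array shape, if one is exposed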

Additional functionality and examples can be found in the demonstrations notebook. The full API documentation can be found here.

Reference

For additional details, please see the preprint. If using this code for published work, please consider citing the paper.

William Gilpin. "Chaos as an interpretable benchmark for forecasting and data-driven modelling" Advances in Neural Information Processing Systems (NeurIPS) 2021 https://arxiv.org/abs/2110.05266

Installation

Install from PyPI

pip install dysts

To obtain the latest version, including new features and bug fixes, download and install the project repository directly from GitHub

git clone https://github.com/williamgilpin/dysts
cd dysts
pip install -I . 

Test that everything is working

python -m unittest

Alternatively, to use this as a regular package without downloading the full repository, install directly from GitHub

pip install git+https://github.com/williamgilpin/dysts

The key dependencies are

  • Python 3+
  • numpy
  • scipy
  • pandas
  • sdeint (optional, but required for stochastic dynamics)
  • numba (optional, but speeds up generation of trajectories)

These additional optional dependencies are needed to reproduce some portions of this repository, such as benchmarking experiments and estimation of invariant properties of each dynamical system:

  • nolds (used for calculating the correlation dimension)
  • darts (used for forecasting benchmarks)
  • sktime (used for classification benchmarks)
  • tsfresh (used for statistical quantity extraction)
  • pytorch (used for neural network benchmarks)

Contributing

New systems. If you know of any systems that should be included, please feel free to submit an issue or pull request. The biggest bottleneck when adding new models is the lack of known parameter values and initial conditions, so please provide a reference or code containing all parameter values needed to reproduce the claimed dynamics. Because there are infinitely many chaotic systems, we are currently only including systems that have appeared in published work.

Development and Maintenance. We are very grateful for any suggestions or contributions. See the to-do list below for some of the ongoing work.

Benchmarks

The benchmarks reported in our preprint can be found in benchmarks. An overview of the directory's contents is given in BENCHMARKS.md, and individual task areas are summarized in corresponding Jupyter notebooks at the top level of the directory.

Contents

  • Code to generate benchmark forecasting and training experiments is included in benchmarks
  • Precomputed time series with training and test partitions are included in data
  • The raw definitions and metadata for all chaotic systems are included in the database file chaotic_attractors. The Python implementations of the differential equations can be found in the flows module

Implementation Notes

  • Currently there are 131 continuous-time models, including several delay differential equations. There is also a separate module with 10 discrete maps, which is currently being expanded.
  • The right-hand side of each dynamical equation is compiled using numba wherever possible. Ensembles of trajectories are vectorized where needed.
  • Attractor names, default parameter values, references, and other metadata are stored in parseable JSON database files. Parameter values are based on standard or published values, and default initial conditions were generated by running each model until the moments of the autocorrelation function all become stationary.
  • The default integration step is stored in each continuous-time model's dt field. This timestep was chosen based on the highest significant frequency observed in the power spectrum, with significance determined relative to random phase surrogates. The period field contains the timescale associated with the dominant frequency in each system's power spectrum. When model.make_trajectory() is called with the optional setting resample=True, integration is performed at the default dt and the trajectory is then resampled based on the period. The resulting trajectories have consistent dominant timescales across models despite different integration timesteps; a short sketch follows below.
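
A minimal sketch of these fields and the resampling behavior, reusing the Lorenz model from Basic Usage (passing pts_per_period to make_trajectory is assumed to mirror the ensemble function above):

from dysts.flows import Lorenz

model = Lorenz()
print(model.dt)      # default integration timestep
print(model.period)  # timescale of the dominant spectral peak

# integrate at the default dt, then resample so that each dominant
# period contains pts_per_period points
sol = model.make_trajectory(1000, resample=True, pts_per_period=100)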

Acknowledgements

  • Two existing collections of named systems can be found on the webpages of Jürgen Meier and J. C. Sprott. The current version of dysts contains all systems from both collections.
  • Several of the analysis routines (such as calculation of the correlation dimension) use the library nolds. If re-using the fractal dimension code that depends on nolds, please be sure to credit that library and heed its license. The Lyapunov exponent calculation is based on the QR factorization approach used by Wolf et al. (1985) and Eckmann et al. (1986), with implementation details adapted from conventions in the Julia library DynamicalSystems.jl.

Ethics & Reporting

Dataset datasheets and metadata are reported using the dataset documentation guidelines described in Gebru et al. (2018); please see our preprint for a full dataset datasheet and other information. We note that all datasets included here are mathematical in nature and do not contain human or clinical observations. If any users become aware of unintended harms that may arise from the use of this data, we encourage reporting them by submitting an issue on this repository.

Development to-do list

A partial list of potential improvements in future versions

  • Speed up the delay equation implementation
    • We need to roll our own implementation of DDE23 in the utils module.
  • Improve calculations of Lyapunov exponents for delay systems
  • Implement multivariate multiscale entropy and re-calculate for all attractors
  • Add a method for integrating multiple systems in parallel, based on a list of names and a set of shared settings
    • Can use multiprocessing for a few systems, but greater speedups might be possible by compiling all right-hand sides into a single function acting on a large vector.
    • Can also use this same utility to integrate multiple initial conditions for the same model
  • Add a separate jacobian database file, and add an attribute that can be used to check if an analytical one exists. This will speed up numerical integration, as well as potentially aid in calculating Lyapunov exponents.
  • Align the initial phases, potentially by picking default starting initial conditions that lie on the attractor, but which are as close as possible to the origin
  • Expand and finalize the discrete dysts.maps module
    • Maps are deterministic but not differentiable, and so not all analysis methods will work on them. Will probably need a decorator to declare whether utilities work on flows, maps, or both
  • Switch stochastic integration to a newer package, like torchsde or sdepy