Projecting interval uncertainty through the discrete Fourier transform

Overview


This repository provides a method that propagates interval uncertainty through the discrete Fourier transform and yields exact bounds on the Fourier amplitude and the Power Spectral Density function. The algorithm applies to real sequences of intervals. The method allows technical analysts to project interval uncertainty present in time signals to their Fourier amplitude without making assumptions about the error distribution at each time step. Thus, system responses can be calculated and analysed in the frequency domain without conducting extensive Monte Carlo simulations in the time domain. The applicability of the method in practice is demonstrated by a technical application.
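In the notation of the reference below, the underlying problem can be sketched as follows (a summary, not the repository's exact formulation): for a signal whose samples are only known to lie in intervals, $x_n \in [\underline{x}_n, \overline{x}_n]$ for $n = 0, \dots, N-1$, the method computes, for every frequency index $k$, the exact range of the DFT amplitude

\underline{A}_k = \min_{x_n \in [\underline{x}_n,\,\overline{x}_n]} \left|\sum_{n=0}^{N-1} x_n\, e^{-i 2\pi k n / N}\right|, \qquad \overline{A}_k = \max_{x_n \in [\underline{x}_n,\,\overline{x}_n]} \left|\sum_{n=0}^{N-1} x_n\, e^{-i 2\pi k n / N}\right|,

rather than estimating it by sampling signals from inside the intervals.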

Disclaimer: This code was developed for illustration purposes and as a proof of concept; it is therefore not optimised for large-scale applications. An optimised version of the code is currently under development.

References

De Angelis, M.; Behrendt, M.; Comerford, L.; Zhang, Y.; Beer, M. (2021): Forward interval propagation through the discrete Fourier transform, The 9th international workshop on Reliable Engineering Computing, arXiv:2012.09778.

Installation

Clone the git repository to your machine, cd into the repository, open a Python 3 interpreter and import the interval Fourier transform and other useful packages:

from fourier.transform import transform as intervalDFT
from fourier.application import application as app
from fourier.number import number as int_num
import numpy
from numpy import (arange, cos, exp, linspace, mean, pi,  sin, zeros) 
from matplotlib import pyplot, cm

Signal generation and interval DFT

First, time and frequency parameters and an analytical PSD function are needed to model a stochastic process.

Define parameters

wu = 2.2975 # upper cut-off frequency
T = 350 # total time length

dt = 2*pi /(2*wu) # timestep size
dw = 2*pi / T # frequency step size

t = numpy.arange(0,T,dt) # time vector
w = numpy.arange(0,wu,dw) # frequency vector

JONSWAP power spectrum

The JONSWAP power spectrum is utilised to generate stochastic processes. The required parameters are:

alpha = 0.0081 # spectral energy parameter
w_p = 0.7 # peak frequency
gamma = 3.3 # peak enhancement factor
sigma1 = 0.07 # spectral width parameter for w <= w_p
sigma2 = 0.09 # spectral width parameter for w > w_p
spectrum = app.jonswap_spectrum(w,alpha,w_p,gamma,sigma1,sigma2)
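For reference, a commonly used form of the JONSWAP spectrum with these parameters is shown below; the exact expression evaluated by app.jonswap_spectrum may differ in detail.

S(\omega) = \frac{\alpha g^2}{\omega^5}\,\exp\!\left[-\frac{5}{4}\left(\frac{\omega_p}{\omega}\right)^4\right]\gamma^{\exp\!\left[-\frac{(\omega-\omega_p)^2}{2\sigma^2\omega_p^2}\right]}

Here g is the gravitational acceleration, and sigma equals sigma1 for frequencies up to the peak frequency w_p and sigma2 above it.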

Plot the JONSWAP power spectrum

ax = app.plot_line(w,spectrum,figsize=(18,6),xlabel=r'#$x$',ylabel='$x$',color=None,lw=1,title='JONSWAP power spectrum',ax=None,label=None)
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)


Generate time signal and intervalize it

To generate a stochastic process, the spectral representation method is utilised. This signal is then intervalised with an interval uncertainty of ±0.1. Both signals are plotted.
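For orientation, a minimal NumPy sketch of the spectral representation method is given below, assuming independent uniform random phases and one common normalisation of the one-sided PSD; the actual implementation in app.stochastic_process may differ in detail.

phi = numpy.random.uniform(0, 2*pi, len(w))  # independent random phases
A = numpy.sqrt(2*spectrum*dw)                # harmonic amplitudes from the one-sided PSD
waves_sketch = numpy.sqrt(2)*numpy.sum(A*cos(numpy.outer(t, w) + phi), axis=1)  # superposed cosines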

sea_waves = app.stochastic_process(spectrum,w,t) 
pm = 0.1
sea_waves_interval = intervalDFT.intervalize(sea_waves, pm)

ax = app.plot_line(t,sea_waves,figsize=(18,6),xlabel='Time [s]',ylabel='Wave height [m]',color='rebeccapurple',lw=1,title='Signal from stationary power spectrum',ax=None,label=None)
sea_waves_interval.plot(xlabel='Time [s]',ylabel='Wave height [m]',title=r'Signal with $\pm$ '+str(pm)+' information gaps (intervals)')


Compute the Fourier transforms

Compute the Fourier transform of the crisp signal, and the interval Fourier transform of the interval signal with both the interval method and the selective method. Also compute the periodogram of the respective (bounded) Fourier amplitudes.

FA = intervalDFT.Fourier_amplitude(sea_waves)
BI,BS = intervalDFT.compute_amplitude_bounds(sea_waves_interval)
BI.insert(0,int_num.Interval(0,0))
BS.insert(0,int_num.Interval(0,0))

FA = app.periodogram(FA, t, dt)
BI = app.periodogram(BI, t, dt)
BS = app.periodogram(BS, t, dt)
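As a reminder, a periodogram rescales the (bounded) Fourier amplitudes A_k into a PSD estimate. One common convention for a record of length T = N*dt is given below; the exact scaling and the one-/two-sided factor used by app.periodogram may differ.

\hat{S}(\omega_k) = \frac{\Delta t^{2}}{2\pi T}\,A_k^{2}

Since squaring is monotonic for non-negative amplitudes, interval bounds on the amplitudes translate directly into bounds on the periodogram.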

Plot the interval Fourier transform

The amplitude of the crisp signal and both bounded Fourier amplitudes are plotted.

ax = app.plot_line(w,FA,figsize=(18,6),xlabel=r'#$x$',ylabel=r'$x$',color=None,lw=1,title=None,ax=None,label='Interval uncertainty: $\pm$ '+str(pm)+'')
app.plot_bounds(x=w,bounds=BI,color='cornflowerblue',alpha=0.4,ax=ax)
app.plot_bounds(x=w,bounds=BS,color='orangered',alpha=0.6,ax=ax)
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.tick_params(direction='out', length=6, width=2, labelsize=14)


Application to an SDOF system

The system under investigation is an offshore wind turbine simplified to an SDOF system. The parameters are set to

R = 3 # outer radius
r = 2.8 # inner radius
h_pile = 60 # height
rho_steel = 7800 # density of steel
c = 1e5 # stiffness
k = 1e6 # damping coefficient

Get the natural frequency w0 and the damping ratio xi

w0,xi = app.wind_turbine(R,r,h_pile,rho_steel,c,k)
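For a lumped single-degree-of-freedom model, these quantities typically follow from the pile mass and the stiffness and damping coefficients. A minimal sketch is given below, writing the stiffness as k_s and the damping coefficient as c_d to keep the symbols generic; app.wind_turbine may lump the mass differently.

m = \rho_{\text{steel}}\,\pi\left(R^2 - r^2\right)h_{\text{pile}},\qquad \omega_0 = \sqrt{\frac{k_s}{m}},\qquad \xi = \frac{c_d}{2\sqrt{k_s\, m}}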

The response can be obtained by pushing the (intervalised) signal through the frequency response function
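A standard form of the squared frequency response function of such a system, up to an overall mass or stiffness scaling that app.frequency_response may apply internally, is

|H(\omega)|^{2} = \frac{1}{\left(\omega_0^{2} - \omega^{2}\right)^{2} + \left(2\xi\omega_0\omega\right)^{2}},\qquad S_{\text{out}}(\omega) = |H(\omega)|^{2}\,S_{\text{in}}(\omega)

Because the squared frequency response is non-negative, multiplying the input PSD bounds by it preserves their ordering, so the interval bounds carry over directly to the response.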

freq_response_precise = app.frequency_response(w,FA,w0,xi)
freq_response_BI_low,freq_response_BI_high = app.frequency_response_interval(w,BI,w0,xi)
freq_response_BS_low,freq_response_BS_high = app.frequency_response_interval(w,BS,w0,xi)

Those responses can be plotted

ax = app.plot_line(w,freq_response_precise,figsize=(18,6),xlabel=r'#$x$',ylabel=r'$x$',color=None,lw=1,title=None,ax=None,label=None)
ax.fill_between(x=w,y1=freq_response_BI_low,y2=freq_response_BI_high, alpha=0.4, label='Interval', edgecolor='blue', lw=2, color='cornflowerblue')
ax.fill_between(x=w,y1=freq_response_BS_low,y2=freq_response_BS_high, alpha=0.6, label='Selective', edgecolor='red', lw=2, color='orangered')

ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.set_title(r'Interval uncertainty: $\pm$ '+str(pm)+'', fontsize=20)

ax.tick_params(direction='out', length=6, width=2, labelsize=14)
_=ax.set_xlim([0.5, 1.1])


Comparison with Monte Carlo

This section illustrates how severely interval uncertainty is underestimated by Monte Carlo simulation. To show this, a signal with interval uncertainty ±0.5 is utilised and plotted.

pm = 0.5
sea_waves_interval_05 = intervalDFT.intervalize(sea_waves, pm)
sea_waves_interval_05.plot(xlabel='Time [s]',ylabel='Wave height [m]',title=r'Signal with $\pm$ '+str(pm)+' information gaps (intervals)')


Generate some random signals between the bounds; any signal that lies within or on the bounds is possible.
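A minimal sketch of what such inner sampling could look like, assuming every interval sample is drawn independently and uniformly between its bounds (the repository's rand() may proceed differently):

lo = sea_waves - pm   # lower bound of each interval sample
hi = sea_waves + pm   # upper bound of each interval sample
one_inner_signal = lo + (hi - lo)*numpy.random.rand(len(sea_waves))  # one possible signal inside the bounds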

RAND_SIGNALS = sea_waves_interval_05.rand(N=20) # this picks out N (inner) random signals within the bounds

fig,ax = intervalDFT.subplots(figsize=(16,8))
for rs in RAND_SIGNALS:
    intervalDFT.plot_signal(rs,ax=ax)
sea_waves_interval_05.plot(ax=ax)
ax.grid()
_=ax.set_xlim(0,55) # underscore here is used to suppress the output of this line


Computing the Fourier amplitude bounds and the periodogram of the interval signal

BI,BS = intervalDFT.compute_amplitude_bounds(sea_waves_interval_05)
BI.insert(0,int_num.Interval(0,0))
BS.insert(0,int_num.Interval(0,0))

BI = app.periodogram(BI, t, dt)
BS = app.periodogram(BS, t, dt) 

Plotting the bounds of the Fourier amplitude in comparison to the resulting bounds obtained by Monte Carlo

BI_low=[ai.lo() for ai in BI]
BI_high=[ai.hi() for ai in BI]
BS_low=[ai.lo() for ai in BS]
BS_high=[ai.hi() for ai in BS]

fig = pyplot.figure(figsize=(18,6))
ax = fig.subplots()
ax.grid()
ax.fill_between(x=w,y1=BI_low,y2=BI_high, alpha=0.4, label='Interval', edgecolor='blue', lw=2, color='cornflowerblue')
ax.fill_between(x=w,y1=BS_low,y2=BS_high, alpha=0.6, label='Selective', edgecolor='red', lw=2, color='orangered')

n_MC = 10
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    #intervalDFT.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='palegreen',lw=1,title=None,ax=ax,label=None) 
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#d7f4d7',lw=1,title=None,ax=ax,label=None) 

ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.set_title(r'Interval uncertainty: $\pm$ '+str(pm)+'', fontsize=20)

ax.tick_params(direction='out', length=6, width=2, labelsize=14)  


With increasing sample size, the range within the bounds of the interval signal is covered better. However, even a very high sample size is insufficient to get close to the bounds obtained by the interval DFT.

fig = pyplot.figure(figsize=(18,6))
ax = fig.subplots()
ax.grid()

n_MC = 1000
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#7cc47c',lw=1,title=None,ax=ax,label=None) 
    
n_MC = 100
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#a7d9a7',lw=1,title=None,ax=ax,label=None) 
    
n_MC = 10
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#d7f4d7',lw=1,title=None,ax=ax,label=None) 
    
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
_=ax.set_title('Bounds estimated by MC', fontsize=20) 

