
Neuro Digital Signal Processing Toolbox


Tools to analyze and simulate neural time series, using digital signal processing.

Overview

neurodsp is a collection of approaches for applying digital signal processing to neural time series, including algorithms that have been proposed for the analysis of neural data. It also includes tools for simulating plausible neural time series.

Available modules in NeuroDSP include:

  • filt : Filter data with bandpass, highpass, lowpass, or notch filters
  • burst : Detect bursting oscillations in neural signals
  • rhythm : Find and analyze rhythmic and recurrent patterns in time series
  • spectral : Compute spectral domain features such as power spectra
  • timefrequency : Estimate instantaneous measures of oscillatory activity
  • sim : Simulate time series, including periodic and aperiodic signal components
  • plts : Plotting functions
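
A minimal sketch of how these modules fit together (assuming the current 2.x API; the function names match those used elsewhere on this page, but the parameter values are arbitrary):

    from neurodsp.sim import sim_oscillation
    from neurodsp.filt import filter_signal
    from neurodsp.spectral import compute_spectrum
    from neurodsp.plts import plot_time_series, plot_power_spectra
    from neurodsp.utils.data import create_times

    # Simulate 10 seconds of a 10 Hz oscillation, sampled at 1000 Hz
    n_seconds, fs = 10, 1000
    sig = sim_oscillation(n_seconds, fs, freq=10)

    # Low-pass filter the signal below 20 Hz
    sig_filt = filter_signal(sig, fs, 'lowpass', (None, 20))

    # Compute the power spectrum of the (unfiltered) signal, and plot the results
    freqs, powers = compute_spectrum(sig, fs)
    times = create_times(n_seconds, fs)
    plot_time_series(times, sig_filt)
    plot_power_spectra(freqs, powers)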

Documentation

Documentation for the NeuroDSP module is available here.

The documentation includes:

  • Tutorials: describing and working through each module in NeuroDSP
  • Examples: demonstrating example applications and workflows
  • API List: listing and describing all the code and functionality available in the module
  • Glossary: defining the key terms used in the module

If you have a question about using NeuroDSP that doesn't seem to be covered by the documentation, feel free to open an issue and ask!

Dependencies

NeuroDSP is written in Python, and requires Python >= 3.6 to run.

It has the following required dependencies:

  • numpy
  • scipy
  • matplotlib

Optional dependencies:

  • pytest is needed if you want to run the test suite locally

We recommend using the Anaconda distribution to manage these requirements.

Install

The current major release of NeuroDSP is the 2.X.X series.

See the changelog for notes on major version releases.

Stable Release Version

To install the latest stable release, you can use pip:

$ pip install neurodsp

NeuroDSP can also be installed with conda, from the conda-forge channel:

$ conda install -c conda-forge neurodsp

Development Version

To get the current development version, first clone this repository:

$ git clone https://github.com/neurodsp-tools/neurodsp

To install this cloned copy, move into the directory you just cloned, and run:

$ pip install .

Editable Version

To install an editable version, download the development version as above, and run:

$ pip install -e .

Contribute

This project welcomes and encourages contributions from the community!

To file bug reports and/or ask questions about this project, please use the Github issue tracker.

To see and get involved in discussions about the module, check out:

  • the issues board for topics relating to code updates, bugs, and fixes
  • the development page for discussion of potential major updates to the module

When interacting with this project, please use the contribution guidelines and follow the code of conduct.

Reference

If you use this code in your project, please cite:

Cole, S., Donoghue, T., Gao, R., & Voytek, B. (2019). NeuroDSP: A package for
neural digital signal processing. Journal of Open Source Software, 4(36), 1272.
DOI: 10.21105/joss.01272

Direct Link: https://doi.org/10.21105/joss.01272

Bibtex:

@article{cole_neurodsp:_2019,
    title = {NeuroDSP: A package for neural digital signal processing},
    author = {Cole, Scott and Donoghue, Thomas and Gao, Richard and Voytek, Bradley},
    journal = {Journal of Open Source Software},
    year = {2019},
    volume = {4},
    number = {36},
    issn = {2475-9066},
    url = {https://joss.theoj.org/papers/10.21105/joss.01272},
    doi = {10.21105/joss.01272},
}

Funding

Supported by NIH award R01 GM134363 from the NIGMS.



Comments
  • Variable names


    We don't have super consistent variable naming. With a breaking 1.0 release, should we go through and systematize it all to snake_case (so, for example, 'N_samples' -> 'n_samples')?

    Also, some specific cases:

    • filter overwrites the python filter function, which can be really problematic.
      • Especially because NDSP by default imports all functions directly into the local namespace, this always shadows filter, meaning you can never use Python's built-in filter - so it has side effects / breaks things
    • psd, as a function, strikes me as not a great name. Elsewhere, psd is used as a variable name, and it sounds like data more than a function
      • Heuristic: function names should be 'verb-y', because they do things.

    We also have a bunch of single character variables, but this (in some cases), might be fine for the math-y parts, and also some names, like 'Fs', that are non-compliant, but (I think), are inherited from style in scipy.signal, or similar. Do we want to update those names too?

    opened by TomDonoghue 21
  • [ENH] - Simulate fractional Gaussian noise and fractional Brownian motion.


    This PR adds aperiodic simulations for fractional brownian motion and fractional gaussian noise for a given Hurst parameter in the range (0,1). Effectively this means we have another means of simulating aperiodic signals with a known power law exponent. The simulation of fractional gaussian noise uses the Cholesky method for simplicity, which runs in O(N^2) flops for a signal of length N. Fractional brownian motion is generated by calculating the cumulative sum of the corresponding fractional gaussian noise. For further details on simulating fractional brownian motion and fractional gaussian noise, see Ton Dieker's thesis, for example.

    Below is a minimal working example of simulations for standard brownian motion and positively correlated brownian motion. A simple linear regression is fit to the log-log power spectrum and the relative error of the slopes is compared to what the true values are.

    import numpy as np
    from neurodsp.spectral import compute_spectrum
    from neurodsp.plts import plot_time_series, plot_power_spectra
    from neurodsp.sim import sim_fbm
    from neurodsp.utils.data import create_times
    from fooof import FOOOF
    
    np.random.seed(0)
    
    n_seconds = 1
    fs = 2*10**3
    times = create_times(n_seconds, fs)
    
    # Simulate standard brownian motion (power law exponent = 2)
    # and positively correlated brownian motion (power law exponent = 2.5).
    # Recall that the power law exponent and the Hurst exponent are related by
    # beta = 2H + 1 for brownian motion.
    sig_bm = sim_fbm(n_seconds, fs, hurst=0.5)
    sig_pbm = sim_fbm(n_seconds, fs, hurst=0.75)
    plot_time_series(times, [sig_bm, sig_pbm], labels=["Brownian Motion", "Positively Correlated Brownian Motion"])
    
    # Plot power spectra
    freqs, pspec_bm = compute_spectrum(sig_bm, fs)
    _, pspec_pbm = compute_spectrum(sig_pbm, fs)
    plot_power_spectra(freqs, [pspec_bm, pspec_pbm], labels=["Brownian Motion", "Positively Correlated Brownian Motion"])
    
    # Use spectral fitting
    exp_bm, _ = np.polyfit(np.log10(freqs[1:]), np.log10(pspec_bm[1:]), 1)
    exp_pbm, _ = np.polyfit(np.log10(freqs[1:]), np.log10(pspec_pbm[1:]), 1)
    
    rel_err_bm = np.abs(-2 - exp_bm)/2
    rel_err_pbm = np.abs(-2.5 - exp_pbm)/2.5
    
    print("Relative error of estimated power law exponent for Brownian Motion: {:.2f}".format(rel_err_bm))
    print("Relative error of estimated power law exponent for positively correlated Brownian Motion: {:.2f}".format(rel_err_pbm))
    
    2.2 
    opened by elybrand 15
  • Refactor Filter Code


    Responds to #99

    So I had a hack at the filter file / function, related to the discussion in #99.

    The sort of 'natural' organization that comes out of simply re-organizing the code as it was is to split up FIR & IIR filters, and also to split out some checking functions, as well as the compute transition band, into their own functions. As it is, this just moves around code, but doesn't copy any.

    With just a little more work, this will, I think, make the FIR & IIR functions directly usable, if one wants to skip right to them, and also the 'calc_transition_band' can definitely be refactored. There are a bunch of other small updates / changes that could be done too. But before I get too far down the rabbit hole, wanted to check in.

    What do y'all think of reorganizing in this way? cc: @srcole @rdgao

    Note: WIP - not ready to merge.

    opened by TomDonoghue 14
  • null valued array when putting filtered simulated oscillator into spectral.compute_spectrum


    I am getting an array with null values as output when I put sim.sim_oscillator into filt.filter_signal and that into spectral.compute_spectrum. Code example (with a bunch of printed arrays to show where the problem is):

    import numpy as np
    np.random.seed(0)

    from neurodsp import spectral, sim, filt

    %matplotlib inline
    import matplotlib.pyplot as plt
    import scipy as sp
    from scipy import signal

    n_samples_cycle = 100
    fs = 1000
    osc_freq = 6.5
    fc = 20
    oscA = sim.sim_oscillator(n_samples_cycle, fs, osc_freq, rdsym=.5)
    oscB = sim.sim_oscillator(n_samples_cycle, fs, osc_freq, rdsym=.05)
    oscC = filt.filter_signal(oscA, fs, 'lowpass', (None, fc))
    oscD = filt.filter_signal(oscB, fs, 'lowpass', (None, fc))

    print("psdA:")
    print(oscA[:])
    print("psdB:")
    print(oscB[:])

    # Plot time series
    plt.figure(figsize=(24, 3))
    plt.plot(oscA, 'k', label='rdsym='+str(.5), alpha=.8)
    plt.plot(oscB, 'r', label='rdsym='+str(.3), alpha=.8)
    plt.plot(oscC, 'b', label='rdsym='+str(.5), alpha=.8)
    plt.plot(oscD, 'g', label='rdsym='+str(.3), alpha=.8)
    plt.ylim((-1.1, 1.7))
    plt.xlim((0, 1000))
    plt.legend()
    plt.xlabel('Time (sample)')
    plt.ylabel('Voltage')

    # Plot power spectrum
    fA, psdA = spectral.compute_spectrum(oscA, 1000)
    fB, psdB = spectral.compute_spectrum(oscB, 1000)
    fC, psdC = spectral.compute_spectrum(oscC, 1000)
    fD, psdD = spectral.compute_spectrum(oscD, 1000)
    print("fA")
    print(fA[:])
    print("fB")
    print(fB[:])
    print("fC")
    print(fC[:])
    print("fD")
    print(fD[:])
    print("psdA")
    print(psdA[:])
    print("psdB")
    print(psdB[:])
    print("psdC")
    print(psdC[:])
    print("psdD")
    print(psdD[:])
    plt.figure(figsize=(20, 5))
    plt.loglog(fA, psdA, 'k', alpha=.5)
    plt.loglog(fB, psdB, 'r', alpha=.5)
    plt.loglog(fC, psdC, 'b', alpha=.5)
    plt.loglog(fD, psdD, 'g', alpha=.5)
    plt.xlabel('Frequency (Hz)')
    plt.ylabel('Power')

    opened by limwik 12
  • sim suggestions


    I'm starting to use neurodsp.sim, and have some possible suggestions:

    ToDos:

    • [x] Add in general colored noise generation (implementation available here: https://github.com/felixpatzelt/colorednoise) [Note: Richard added his own implementation].
    • [x] The 'Input Suggestion' docs for sim_oscillator are wrong / misplaced, right? Am I missing something? [Fixed: with more general doc clean ups.]
    • [x] I'll probably update some variable names, to update to our API conventions (ex - brownNf -> brown_nf or similar).

    Open Questions:

    • Is there any reason not to generalize all the functions that currently add 1/f^2 noise, to be able to add 1/f^n noise, with a settable n? (See the sketch after this list.)
    • Mixed conventions: some functions take in a signal length, others a number of samples. We should consolidate on one approach - any suggestions on which is better?
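
    On the first open question, here is a rough numpy-only sketch of generalized 1/f^n generation, by shaping white noise in the frequency domain - purely illustrative, and not the implementation that was ultimately added:

    import numpy as np

    def sim_powerlaw_noise(n_samples, fs, exponent):
        """Illustrative 1/f**exponent noise, via spectral shaping of white noise."""

        # Take the FFT of white noise, and get the corresponding frequencies
        spectrum = np.fft.rfft(np.random.randn(n_samples))
        freqs = np.fft.rfftfreq(n_samples, d=1/fs)

        # Scale amplitudes so that power falls off as 1/f**exponent (skip the DC bin)
        scale = np.ones_like(freqs)
        scale[1:] = freqs[1:] ** (-exponent / 2)

        # Back to the time domain, normalized to zero mean and unit variance
        sig = np.fft.irfft(spectrum * scale, n=n_samples)
        return (sig - sig.mean()) / sig.std()

    # For example, 10 seconds of 1/f^2 ('brown') noise at 1000 Hz
    sig = sim_powerlaw_noise(n_samples=10000, fs=1000, exponent=2)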

    Anyways - I'll use this issue for continuing points of discussion, and open a PR soon-ish with some updates - so let me know of any thoughts about things here.

    opened by TomDonoghue 10
  • Add example using with MNE


    This is a draft of an example for using NDSP together with MNE, related to a suggestion on the JOSS review, and as mentioned in #143

    The broad outline is straightforward (grab the MNE sample data, and start messing with it), but I realized it's not so obvious what to do / show in this example.

    So far this is a fairly trivial example of extracting a channel of interest, and checking for bursts.

    The main thing we would want to show, though, is using NDSP + MNE across multiple channels, and probably in an event related manner, and focusing on custom and interesting NDSP specialties (not showing things that MNE already does well, such as filtering).

    So, what quick & straightforward analyses with NDSP would be most useful / interesting to show with an MNE organization / dataset? Ideally a specific but cool analysis on epoched data, across multiple electrodes (channel clusters).

    The example data is an audio-visual task whereby subjects detect a visual stimulus presentation: https://martinos.org/mne/stable/manual/sample_dataset.html
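
    For reference, a rough sketch of the minimal version of this (MNE sample data, one channel, dual-threshold burst detection). The channel name and the threshold / frequency values below are just placeholders, not the settings used in the example:

    import os
    import mne
    from neurodsp.burst import detect_bursts_dual_threshold

    # Load the MNE sample dataset, and read in the raw data
    data_path = mne.datasets.sample.data_path()
    raw_fname = os.path.join(str(data_path), 'MEG', 'sample', 'sample_audvis_raw.fif')
    raw = mne.io.read_raw_fif(raw_fname, preload=True)

    # Extract a single channel of interest as a 1d array, and its sampling rate
    fs = raw.info['sfreq']
    sig = raw.get_data(picks='EEG 058')[0]

    # Check for alpha-band bursts with a dual amplitude threshold
    bursting = detect_bursts_dual_threshold(sig, fs, dual_thresh=(1, 2), f_range=(8, 12))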

    opened by TomDonoghue 9
  • Updates to setup, etc.


    Hey y'all,

    First off: sorry, I made a bit of a mess at first, by accidentally pushing changes straight to the voytekresearch repo instead of my fork (cloned the wrong one... ooops).

    Anyways - main repo has been stepped back to where it was.

    Suggested changes:

    • move version number inside the module, so it is accessible from the code (including a change to setup.py to read version number from the module, so that version is only specified in one place).
    • update setup.py: add more classifiers, and update install_requires
    • update README: add a couple more badges, and update the install instructions.

    Note: don't merge yet, I think there is a bit of work to sort out dependencies, and I'm not 100% sure my setup.py is currently totally proper.

    opened by TomDonoghue 9
  • DOC:


    I was poking around the codebase, checking through it a bit, and here are a bunch of small fixes.

    Almost entirely doc fixes:

    • Fix some spacing issues
    • Get better at line length (even if we haven't yet picked a standard, some of it was a bit excessive)
    • Small clean ups on var names, type checking, etc.
    • Docstring updates, to note when parameters are optional
    opened by TomDonoghue 9
  • [MNT] - Simulations refactors


    This PR does some refactors and tweaks of some of the newer simulation functions.

    Naming

    An open question (nothing changed yet) is if we want to do any naming updates.

    • In particular, across the module we use both exponent (or exp), and now in these newer functions, chi to refer to the aperiodic exponent. I think I would vote we consolidate on using the term exponent consistently for this. The main counterpoint I can think of is that exponent is a pretty generic name, and in some cases it could be a little unclear.
    • While we're on the topic, I'm not 100% sold on sim_peak_oscillation, which seems to imply a slightly different thing from this being a "combined" signal. I'm not sure I have a better suggestion though...

    Thoughts?

    Optimizations

    For sim_knee and sim_peak_oscillation, there were a bunch of embedded loops that seemed optimizable - the updates here use sub-functions where appropriate, and vectorize functions to apply them across arrays. Combining both also removed some interim computations of arrays. I also think the code might be a little clearer.

    Optimization updates:

    • sim_knee: speedup of ~20-25% for shorter signals (10s), and ~100% for longer signals (60s)
    • sim_peak_oscillation: speedup of ~20-25% for shorter signals (10s), and ~100% for longer signals (60s)

    All the actual computations are the same code, and I validated that the new versions give the same results.

    @ryanhammonds - in terms of review, I'm fairly confident about the updates here, so they shouldn't need a wild amount of further testing for the code changes. Let me know if you have any thoughts on naming, and I'd also be curious to hear if you have any other optimization ideas.

    2.2 
    opened by TomDonoghue 8
  • [ENH] Simulate variable oscillations


    This allows the frequency and/or simulation kwarg(s) to vary on a cycle-by-cycle basis.

    import numpy as np
    from neurodsp.sim import sim_variable_oscillation
    from neurodsp.plts import plot_time_series
    
    fs = 1000
    
    freqs = [ 5, 10, 15, 20]
    rdsyms= [.2, .4, .6, .8]
    
    sig = sim_variable_oscillation(fs, freqs, cycle='asine', rdsym=rdsyms)
    times = np.arange(0, len(sig)/fs, 1/fs)
    
    plot_time_series(times, sig)
    


    opened by ryanhammonds 7
  • [DOC] - Update filter tutorial(s)


    This PR updates the filter documentation / tutorials, relating to #229.

    @ryanhammonds : following the issue, can you have a go working on this PR? I would suggest digging into the linked papers to guide some extra detail that can be added. If you can organize the new layout, add some detail, and extend as you think is needed, I can come back to it after and edit. Thanks!

    documentation 2.2 
    opened by TomDonoghue 7
  • [MNT] - Update action version numbers


    Updates CI tests for:

    • pinning the Ubuntu version, to maintain support for py3.6 (which fails on ubuntu-latest)
    • updating the versions of the actions (which addresses future deprecations)
    • adding testing and listed support for Python 3.11
    opened by TomDonoghue 0
  • [FIX] - Add fix for special case of rdsym


    For sim_asine_cycle, there are special cases of rdsym values that can lead to an off-by-one error in terms of computing the number of samples in the cycle. This comes from how we compute the number of samples based on the rdsym value.

    For example, on main, the following code would get the wrong sample length (101 instead of 100): cyc = sim_asine_cycle(0.1, 1000, 1., side='both')

    This can happen for 'peak' or 'both' when rdsym is 1., and for 'trough' when rdsym is 0.

    This PR adds a check and fix for this issue, as well as some extra tests to keep an eye on this issue (note that these tests would fail on current main branch).

    bug 
    opened by TomDonoghue 0
  • Converting knee param to knee freq (double exponential)


    This issue is to pick up on a discussion that started in #290, for how to convert between the knee parameter and the knee frequency, specifically in the case of a double exponential model. To keep PRs a bit more modular, this topic was split out from the rest of #290, which does more general refactors of the simulation functions.

    Notes on double-exp model

    The sim_knee function implements a double exponent + knee function, described as: L(freq) = 1 / (freq**(exponent1) * freq**(exponent2 + exponent1) + knee)

    This formulation uses the 'knee parameter', which isn't super easy to interpret. It would be nice to be able to use (and/or at least convert to) the knee frequency - the frequency at which the exponents change. This issue is to figure out how to do so (if it's even possible).

    A candidate for converting between the knee frequency and the knee parameter is the following (see the derivation below):

    knee_term = knee**(-2*exponent1 - exponent2)

    Knee Derivation (from Ryan)

    @ryanhammonds work on deriving the knee from: https://github.com/neurodsp-tools/neurodsp/pull/290#issuecomment-1027567198

    The knee is defined as: knee_freq = knee ** (1 / (2*exponent1 + exponent2))

    from solving for the fwhm in the Lorentzian:

    L(freq) = 1 / (freq**(exponent1) * freq**(exponent2 + exponent1) + knee)
    L(knee_freq) = f(0) / 2
    
    1 / (knee_freq**exponent1 * knee_freq**(exponent2 + exponent1)) + knee = 1 / (2 * knee)
    knee_freq**exponent1 * knee_freq**(exponent2 + exponent1) + knee = 2 * knee
    knee_freq**(2*exponent1 + exponent2) = knee
    knee_freq = knee ** (1 / (2*exponent1 + exponent2))
    

    I think this should be analogous to solving knee_freq = knee**(1/exponent) in the single exponent model.
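
    For concreteness, the proposed conversion can be written as a small helper (the function name here is just illustrative, and whether this is the 'right' definition of the knee frequency for the double-exponent model is exactly the open question of this issue):

    def convert_knee_to_freq(knee, exponent1, exponent2):
        """Candidate conversion from the knee parameter of the double-exponent
        model to a knee frequency, following the derivation above."""
        return knee ** (1 / (2 * exponent1 + exponent2))

    # For example, with exponent1=1, exponent2=2, and knee=100, this gives knee_freq ~= 3.16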

    Notes from Richard

    Note from Richard (https://github.com/neurodsp-tools/neurodsp/pull/290#issuecomment-1068223171): "at a quick glance, I think this might be a bit more complicated, though Ryan's derivations look good, at least in terms of the algebra. ... I'm actually not sure what the "right" answer for the knee_freq is when there's two slopes, bc it's not so easy to define where the tapering off happens when there's no flat plateau"

    Other notes

    In #290, there was an initial update for converting between knee parameter and knee frequency.

    See the following commits for this (from Ryan).

    • adding in the knee conversion: https://github.com/neurodsp-tools/neurodsp/pull/290/commits/ccc69f8a820125998893b914f9a8985addc372c5
    • updating tests for knee conversion: https://github.com/neurodsp-tools/neurodsp/pull/290/commits/0f3af9352d843315286b3dd5a1da06ee4bcccd05

    Note that to separate PRs, some of these changes were reverted in later commits in #290.

    2.3 
    opened by TomDonoghue 1
  • Possible addition: add "anti-1/f" transform?

    In their recent paper, Samaha & Cohen introduce an "anti-1/f" transform. Maybe we could add an implementation of this?

    It's actually pretty simple - and something we have most of the tooling for - what they do is fit a 1/f line, then spectrally rotate the signal, reverting back to a timeseries. To add this, all we'd really need to do is add a helper function to combine spectral rotation with an estimate of the slope.

    They have code available: https://osf.io/f8jqd/
    Reference: https://www.sciencedirect.com/science/article/pii/S1053811922000581
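
    For what it's worth, here is a rough numpy-only sketch of the idea described above - fit the aperiodic slope from the log-log power spectrum, then rescale the Fourier coefficients to flatten it. This is only an illustration of the approach, not Samaha & Cohen's implementation, and not an existing NeuroDSP function:

    import numpy as np

    def anti_one_over_f(sig, fs):
        """Illustrative 'anti-1/f' transform: flatten a signal's aperiodic slope by spectral rotation."""

        # Fourier transform of the signal, and the corresponding frequencies
        spectrum = np.fft.rfft(sig)
        freqs = np.fft.rfftfreq(len(sig), d=1/fs)

        # Fit a line to the log-log power spectrum (skipping the DC bin)
        powers = np.abs(spectrum[1:]) ** 2
        slope, _ = np.polyfit(np.log10(freqs[1:]), np.log10(powers), 1)

        # Rotate the spectrum so the fitted slope becomes flat: power scales by
        # f**(-slope), so the complex amplitudes scale by f**(-slope / 2)
        spectrum[1:] = spectrum[1:] * freqs[1:] ** (-slope / 2)

        # Invert back to a time series
        return np.fft.irfft(spectrum, n=len(sig))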

    enhancement 
    opened by TomDonoghue 0
  • [ENH] - Add a function for simulating modulated oscillations


    We currently simulate continuous and bursty oscillations, and allow for manipulating cycle by cycle, but don't have a function for simulating oscillations with some kind of (continuous) amplitude modulation.

    Idea: we could add a helper function for simulating amplitude modulated signals.

    Related note: neuro oscillations often have a 1/f spectrum of amplitude modulations (see Linkenkaer-Hansen work & related).

    See here for examples of amplitude-modulating signals: https://github.com/voytekresearch/ColourfulSounds/blob/master/Explorations-%20Amplitude%20Modulation.ipynb

    I think the implementation would be easy - simply add sim_modulated_oscillation, which would take inputs to pass through to sim_oscillation, as well as some extra parameters to define the modulating signal, which we then apply to the oscillatory signal before returning. The amplitude modulation could be defined to be either periodic or aperiodic.
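
    A minimal sketch of what such a helper might look like - the name and parameters of sim_modulated_oscillation below are just this proposal, not an existing function, and it leans on the existing sim_oscillation and sim_powerlaw simulations for the carrier and the modulator:

    from neurodsp.sim import sim_oscillation, sim_powerlaw

    def sim_modulated_oscillation(n_seconds, fs, freq, mod_freq=None, mod_exponent=-2., **osc_kwargs):
        """Simulate an oscillation whose amplitude is modulated by a periodic or aperiodic signal."""

        # Simulate the carrier oscillation
        sig = sim_oscillation(n_seconds, fs, freq, **osc_kwargs)

        # Simulate the modulator: periodic (a slow oscillation) or aperiodic (powerlaw signal)
        if mod_freq is not None:
            modulator = sim_oscillation(n_seconds, fs, mod_freq)
        else:
            modulator = sim_powerlaw(n_seconds, fs, exponent=mod_exponent)

        # Shift the modulator to be non-negative, then apply it to the carrier
        modulator = modulator - modulator.min()

        return sig * modulator

    # For example: a 10 Hz oscillation with a 1 Hz amplitude modulation
    sig = sim_modulated_oscillation(n_seconds=10, fs=500, freq=10, mod_freq=1)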

    opened by TomDonoghue 2
  • [ENH] Complete last cycle of bursts


    For bursty signals, the last cycle of each burst was incomplete by one sample. This update fixes this by checking if the next cycle is not oscillating, and if so, adding the last sample to complete the cycle.

    bug 
    opened by ryanhammonds 3
Releases
  • 2.2.1(Sep 27, 2022)

    Patch release version: 2.2.1

    This is a patch release (non-breaking) version of NeuroDSP, as part of the 2.X.X series.

    Changelog:

    Reverts a breaking change introduced in 2.2.0: neurodsp.utils.checks.check_param was deprecated and renamed to check_param_range, making 2.2.0 incompatible with bycycle 1.0.0. Since 2.2.0 was a minor, non-breaking release, check_param has been re-added and aliased to check_param_range. check_param should instead be fully deprecated at a future time, in 3.0.0, to ensure compatibility across the 2.X.X version series.

  • 2.2.0(Sep 21, 2022)

    Minor release version: 2.2.0

    This is a minor, non-breaking, release version of NeuroDSP, as part of the 2.X.X series.

    Changelog:

    Aperiodic

    • Detrended Fluctuation Analysis (#167)
    • IRASA (#212)
    • Autocorrelation (#217)

    Simulation

    • Fractional Gaussian noise and fractional Brownian motion (#216)
    • Burst detection bug fix (#220)
    • Time series with defined spectral parameters (i.e. height, width, center) (#221)
    • Time series with defined Lorentzian parameters (i.e. chi1, chi2, knee) (#222)
    • Asymmetrical gaussian cycles (#228)
    • Define bursts based on set durations (#239)
    • Extrema phase shifting (i.e. trough-to-trough or peak-to-peak) (#247)
    • Custom cycle support (#248)
    • Cycle length fix (#250)
    • Variable oscillation parameters (i.e. varying cycle simulation parameters on a cycle-by-cycle basis) (#252)
    • One-sided asymmetry (#254)
    • Simulate spikes of action potentials (#259)

    Plotting

    • 2d array plotting (time series: #246, spectra: #269)
    • Labeling bug fix and improved tests (#255)
    • Plotting saving updates (#258, #260)

    Maintenance

    • Deprecate increase_n argument of robust_hilbert (#215)
    • Improved speed of multidim decorator (#264)
    • Error for invalid transition bands (#267)

    Documentation

    • Tutorial for Morlet wavelets (#226)
    • Tutorials and updates for lagged coherence and sliding window matching (#230)
    • Tutorials for DFA, IRASA, and Autocorrelation (#231)
  • 2.1.0(Jul 21, 2020)

    Minor release version: 2.1.0

    This is a minor, non-breaking, release version of NeuroDSP, as part of the 2.X.X series.

    Changelog:

    • A technical audit of the code was done, which updates technical approaches, including in the filtering, time-frequency, and spectral modules
    • Documentation updates, including docstring examples and tutorial updates, such as the ability to download example data from the tutorials
    • An update to plot management and styling, increasing the customizability of generated plots
    • Module organization, including re-organizing test file layout
    • Updating management of the documentation site, and updating sphinx versions & materials
    • Miscellaneous bug fixes
  • 2.0.0(Sep 4, 2019)

    This is a release of NeuroDSP to the v2 series, starting at 2.0.0.

    WARNING: This is an API breaking update from the 1.X.X series.

    ChangeLog:

    • Major refactors of most of the code base, including splitting out many functions into subfunctions
    • An update and revamp of the names / vocabulary used, in particular in the sim module.
    • Reorganization of some code, including moving modules into their own folder, and grouping some things.
    • Major additions include extending the plotting utilities, and adding decorators for data normalization and running all functionality across 2D arrays.
  • 1.1.2(Apr 12, 2019)

    This is an update of the v1.X.X series (from 1.0.X to 1.1.X) with some refactors and extensions (but no breaking changes). This is the final planned update of this series, before an API breaking v2 release.

    ChangeLog:

    • Code refactors, including splitting up the organization of the filt, spectral and sim modules, and updates to internal variable naming schemes.
    • Miscellaneous small bug fixes.
    • Documentation updates, including updates and additions to the docsite and tutorials / examples.
    • Project updates, including updates to the README, adding a CodeOfConduct, and adding the JOSS paper draft.

    Note: the .2 minor version is just due to a quirk when uploading to PyPI - there is no 1.1.0 or 1.1.1.

  • 1.0.0(Nov 5, 2018)

    This is a major new release version, the first of the 1.X.X series, and a breaking update from the 0.X.X series.

    Several major API-breaking changes including:

    • waveform shape analysis has moved to bycycle.
    • enhanced features for simulating background aperiodic processes
    • many bug fixes
  • 0.3.1(Jun 5, 2018)

  • 0.3(Apr 9, 2018)

    Many improvements have been made since the last release.

    1. Module to detect if an oscillator is present in a signal (burst.py)
    2. Module to simulate data with neural oscillations (sim.py)
    3. Improvements to waveform shape analysis (shape/cyclefeatures.py)

    Other updates include an updated SCV (spectral coefficient of variation) function, a Dockerfile, updated documentation, and fixes for a few miscellaneous bugs.

  • 0.2(Sep 3, 2017)

    This new release has modules to characterize the power spectrum and phase-amplitude coupling. Several improvements (and bug fixes) have been made to some of the existing methods, particularly in the shape and filtering modules.

  • 0.1(Jun 21, 2017)

    Includes tools for:

    • filtering
    • time-frequency analysis
    • lagged coherence measure of rhythmicity
    • characterizing the shape of oscillatory waveforms