===================================================
Nitime: timeseries analysis for neuroscience data
===================================================

Nitime contains a core of numerical algorithms for time-series analysis both in
the time and spectral domains, a set of container objects to represent
time-series, and auxiliary objects that expose a high-level interface to the
numerical machinery and make common analysis tasks easy to express with compact
and semantically clear code.

Website
=======

Current information can always be found at the NIPY website::

    http://nipy.org/nitime

Mailing Lists
=============

Please see the developer's list here::

    http://mail.scipy.org/mailman/listinfo/nipy-devel

Code
====

You can find our sources and single-click downloads:

* `Main repository`_ on Github.
* Documentation_ for all releases and current development tree.
* Download as a tar/zip file the `current trunk`_.
* Downloads of all `available releases`_.

.. _main repository: http://github.com/nipy/nitime
.. _Documentation: http://nipy.org/nitime
.. _current trunk: http://github.com/nipy/nitime/archives/master
.. _available releases: http://github.com/nipy/nitime/downloads

License information
===================

Nitime is licensed under the terms of the new BSD license. See the file
"LICENSE" for information on the history of this software, terms & conditions
for usage, and a DISCLAIMER OF ALL WARRANTIES.

All trademarks referenced herein are property of their respective holders.

Copyright (c) 2006-2011, NIPY Developers
All rights reserved.
Timeseries analysis for neuroscience data
Overview
Comments
-
Missing plots in granger_fmri.html
Something seems to be off with the last two figures here:
http://nipy.org/nitime/examples/granger_fmri.html
They are missing the body of the graph, which is all white.
I believe the relevant source is doc/examples/granger_fmri.py.
-
Failure to estimate dpss_windows for long signals
I have time series with 166800 samples (raw MEG data), and the call
``alg.dpss_windows(166800, 4, 8)`` fails. However, the equivalent computation
works in Matlab. Any idea how to fix this?
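A possible workaround (a sketch, not a confirmed fix; it assumes your nitime
version exposes the ``interp_from`` argument of ``dpss_windows``) is to compute
the tapers on a shorter grid and interpolate up to the full length, or to fall
back on SciPy's own DPSS routine::

    import nitime.algorithms as alg
    from scipy.signal import windows

    N, NW, K = 166800, 4, 8

    # Option 1 (assumes `interp_from` is available in your nitime version):
    # compute the tapers on a coarser grid and interpolate to the full length.
    dpss, eigvals = alg.dpss_windows(N, NW, K, interp_from=N // 10)

    # Option 2: scipy >= 1.1 ships its own DPSS implementation.
    tapers = windows.dpss(N, NW, Kmax=K)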
-
Latest release breaking Python 2.7, 3.4 (SyntaxError)
In nipype, tests are breaking in Python 2.7 and 3.4 due to the ``@`` operator::

    ./../../virtualenv/python2.7.15/lib/python2.7/site-packages/py/_path/local.py:668: in pyimport
        __import__(modname)
    nipype/interfaces/nitime/__init__.py:5: in <module>
        from .analysis import (CoherenceAnalyzerInputSpec, CoherenceAnalyzerOutputSpec,
    nipype/interfaces/nitime/analysis.py:28: in <module>
        package_check('nitime')
    nipype/utils/misc.py:180: in package_check
        mod = __import__(pkg_name)
    ../../../virtualenv/python2.7.15/lib/python2.7/site-packages/nitime/__init__.py:26: in <module>
        from . import algorithms
    ../../../virtualenv/python2.7.15/lib/python2.7/site-packages/nitime/algorithms/__init__.py:62: in <module>
        from nitime.algorithms.event_related import *
    E     File "/home/travis/virtualenv/python2.7.15/lib/python2.7/site-packages/nitime/algorithms/event_related.py", line 60
    E       h = np.array(linalg.pinv(X.T @ X) @ X.T @ y.T)
    E                                         ^
    E   SyntaxError: invalid syntax
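For context, the offending expression only parses on Python 3.5+, where the
``@`` matrix-product operator exists. The sketch below shows an equivalent
``np.dot`` form that also runs on older interpreters; it is only an
illustration (``X`` and ``y`` here are dummy stand-ins, not nitime's actual
variables), not necessarily the fix that was adopted::

    import numpy as np
    from scipy import linalg

    # Dummy design matrix and data, just to make the snippet runnable; the
    # names mirror the offending line in nitime/algorithms/event_related.py.
    X = np.random.randn(100, 3)
    y = np.random.randn(2, 100)

    # Python 3.5+ only (the `@` operator that breaks Python 2.7 / 3.4):
    # h = np.array(linalg.pinv(X.T @ X) @ X.T @ y.T)

    # Equivalent expression without the `@` operator:
    h = np.array(np.dot(np.dot(linalg.pinv(np.dot(X.T, X)), X.T), y.T))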
-
pip install problem with numpy
This error happens even with numpy installed. It also happens when building
readthedocs projects that have nitime as a requirement and autodoc enabled::
    Collecting numpy (from nitime)
      Using cached numpy-1.10.4-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
    Collecting nitime
      Downloading nitime-0.6.tar.gz (10.0MB)
        100% |████████████████████████████████| 10.0MB 55kB/s
        Complete output from command python setup.py egg_info:
        Traceback (most recent call last):
          File "<string>", line 20, in <module>
          File "/private/var/folders/hw/7bn8tjn96vd58k7sg1ptt82c0000gn/T/pip-build-l6NHiN/nitime/setup.py", line 17, in <module>
            exec(f.read())
          File "<string>", line 2, in <module>
          File "nitime/__init__.py", line 26, in <module>
            from . import algorithms
          File "nitime/algorithms/__init__.py", line 55, in <module>
            from nitime.algorithms.spectral import *
          File "nitime/algorithms/spectral.py", line 10, in <module>
            import numpy as np
        ImportError: No module named numpy

        ----------------------------------------
    Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/hw/7bn8tjn96vd58k7sg1ptt82c0000gn/T/pip-build-l6NHiN/nitime
-
Lazy imports
Here's a set of patches that make nitime imports faster and cleaner by deferring the matplotlib and scipy imports (in a few places) until they are actually needed.
Mostly, I want this so that I can ``import nitime.timeseries`` without pulling
in matplotlib and scipy. Without this PR, wall time for importing nitime is
1.2-1.9 seconds::
    16:[email protected](master)$ time python -c "import nitime"
    real    0m1.879s
    user    0m0.870s
    sys     0m0.415s
With this PR::

    16:[email protected](lazy-imports)$ time python -c "import nitime"
    real    0m0.425s
    user    0m0.242s
    sys     0m0.111s
Which is pretty damn good, considering that on this system::

    16:[email protected](lazy-imports)$ time python -c "import numpy"
    real    0m0.385s
    user    0m0.206s
    sys     0m0.108s
(In particular, there's about a 300 ms advantage from lazy-loading numpy.testing.nosetools!)
Here are some import analyses: using lazy loading saves us from importing ~500
modules up front::

    In [2]: import sys; snn = set([k for k in sys.modules]); len(snn)
    Out[2]: 461   # set of modules - no nitime

    In [3]: import nitime; sn = set([k for k in sys.modules]); len(sn)
    Out[3]: 629   # set of modules with nitime (this is 1102 without lazy loading)

    In [4]: nitime.test(); snt = set([k for k in sys.modules]); len(snt)
    Out[4]: 1540  # set of modules after nitime.test() (1542 without lazy loading)
The functionality of lazyimports.LazyImport is generic enough to let the lazily
imported module act as the real module in almost every way (tab completion,
introspection of docstrings and sources); the one exception is that reloading
is not supported.
For skeptics: add a line such as::

    bogus.parameter : True

to the end of your ~/.matplotlib/matplotlibrc, which will cause a "Bad key"
user warning (matplotlib's complaint) on import. Then::

    In [1]: import sys

    In [2]: import matplotlib.mlab as mlab

    Bad key "bogus.parameter" on line 374 in /home/pi/.matplotlib/matplotlibrc.
    You probably need to get an updated matplotlibrc file from
    http://matplotlib.sf.net/_static/matplotlibrc or from the matplotlib source
    distribution

    In [3]: mlab
    Out[3]: <module 'matplotlib.mlab' from '.../site-packages/matplotlib/mlab.pyc'>

    In [4]: mlab.
    Display all 107 possibilities? (y or n)n

    In [6]: [sys.modules.pop(k) for k in sys.modules.keys() if 'matplotlib' in k];

    In [7]: from nitime.lazyimports import mlab

    In [8]: mlab

    Bad key "bogus.parameter" on line 374 in /home/pi/.matplotlib/matplotlibrc.
    You probably need to get an updated matplotlibrc file from
    http://matplotlib.sf.net/_static/matplotlibrc or from the matplotlib source
    distribution

    Out[8]: <module 'matplotlib.mlab' from '.../site-packages/matplotlib/mlab.pyc'>

    In [9]: mlab.
    Display all 107 possibilities? (y or n)n
In particular, note that for the lazy case the actual import of matplotlib did
not happen until ``In [8]``, which imported the module and called its repr.

As a side note (on making ``reload()`` work): the following code (a bit more
convoluted than what's in this PR) gets closer to supporting reload, but I
haven't been able to figure out what machinery is missing to make it fully
work. Perhaps @fperez has an idea, but it's not a big deal::

    import nitime.descriptors as desc
    from types import ModuleType as module

    class LazyImport(module):
        def __init__(self, modname):
            # module.__init__(self, modname, "foo")
            self.__lazyname__ = modname
            self.__name__ = modname

        @desc.auto_attr  # one-time property
        def __lazyimported__(self):
            name = module.__getattribute__(self, '__lazyname__')
            return __import__(name, fromlist=name.split('.'))

        def __getattribute__(self, x):
            return module.__getattribute__(self, '__lazyimported__').__getattribute__(x)

        def __repr__(self):
            return module.__getattribute__(self, '__lazyimported__').__repr__()
-
Memory error of GrangerAnalyzer
Dear all, when I run a script like this::

    >>> sampling_rate = 1000
    >>> freq_idx_G
    Out[7]: array([40, 41])
    >>> G.frequencies.shape[0]
    Out[8]: 513
    >>> g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)

I get the following memory error::

    ---------------------------------------------------------------------------
    MemoryError                               Traceback (most recent call last)
    <ipython-input-6-b3dd332ebe13> in <module>()
    ----> 1 g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)

    /home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/descriptors.pyc in __get__(self, obj, type)
        138         # Errors in the following line are errors in setting a
        139         # OneTimeProperty
    --> 140         val = self.getter(obj)
        141
        142         setattr(obj, self.name, val)

    /home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in causality_xy(self)
        202     @desc.setattr_on_read
        203     def causality_xy(self):
    --> 204         return self._dict2arr('gc_xy')
        205
        206     @desc.setattr_on_read

    /home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in _dict2arr(self, key)
        191         arr = np.empty((self._n_process,
        192                         self._n_process,
    --> 193                         self.frequencies.shape[0]))
        194
        195         arr.fill(np.nan)

    MemoryError:
Can anyone give me some tips? Thanks!
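From the traceback, ``_dict2arr`` allocates an array of shape
``(n_process, n_process, n_freqs)``, so memory grows with the square of the
number of channels. A quick back-of-the-envelope check (the channel count of
306 is a made-up example, not taken from the report above)::

    n_process = 306    # hypothetical number of input channels (e.g. MEG sensors)
    n_freqs = 513      # G.frequencies.shape[0] from the session above

    bytes_needed = n_process * n_process * n_freqs * 8   # float64 entries
    print("%.1f GB per causality array" % (bytes_needed / 1e9))   # ~0.4 GB

GrangerAnalyzer builds several such arrays, so with many channels or a fine
frequency grid this can exhaust memory; restricting the analysis to fewer
channels (or fewer frequencies) is one way to stay within bounds.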
-
sphinx docs won't build (related to lazyimports?)
::

    Running Sphinx v1.1.2
    /Library/Frameworks/EPD64.framework/Versions/7.2/lib/python2.7/site-packages/matplotlib/__init__.py:908: UserWarning:
    This call to matplotlib.use() has no effect because the backend has already
    been chosen; matplotlib.use() must be called before pylab, matplotlib.pyplot,
    or matplotlib.backends is imported for the first time.
      if warn: warnings.warn(_use_error_msg)
    WARNING: extension 'ipython_console_highlighting' has no setup() function; is it really a Sphinx extension module?
    loading pickled environment... not yet created
    building [html]: targets for 71 source files that are out of date
    updating environment: 71 added, 0 changed, 0 removed
    /Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Note
      warn("Unknown section %s" % key)
    /Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Warning
      warn("Unknown section %s" % key)
    /Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Example
      warn("Unknown section %s" % key)
    reading sources... [ 29%] api/generated/nitime.lazyimports

    Exception occurred:
      File "/Library/Frameworks/EPD64.framework/Versions/7.2/lib/python2.7/site-packages/sphinx/environment.py", line 828, in read_doc
        pickle.dump(doctree, f, pickle.HIGHEST_PROTOCOL)
    PicklingError: Can't pickle <type 'module'>: attribute lookup __builtin__.module failed
    The full traceback has been saved in /var/folders/sf/3b6q6p1d7518rpb4882pzsxw0000gn/T/sphinx-err-9UpBlB.log,
    if you want to report the issue to the developers.
    Please also report this if it was a user error, so that a better error message
    can be provided next time. Either send bugs to the mailing list at
    http://groups.google.com/group/sphinx-dev/, or report them in the tracker at
    http://bitbucket.org/birkenfeld/sphinx/issues/. Thanks!
    make: *** [htmlonly] Error 1

-
Reorganization
This branch contains a major reorganization of algorithms.py into a sub-module
of the library. The main idea is to change the layout of the library, making it
slightly more developer-friendly. I am asking for a review of this in the hope
of getting comments on the general structure. The idea is to adopt a similar
structure for timeseries.py, utils.py, analysis.py and viz.py.
It also brings almost all of the algorithms sub-module to 100% test coverage.
Some of it is just smoke testing, but I have added quite a few real tests for
spectral and coherence, as well as for autoregressive.
The one bit that is still not entirely covered by the tests is
algorithms.wavelet. I am not sure how to use these functions; maybe someone
with a better idea (Kilian?) can take a look and add tests for this sub-module?
-
Fix for changed Sphinx API
Hi,
The build fails due to a change in Sphinx's ``add_directive`` API. Log below::

    $ sphinx-build doc html-no-exec
    Running Sphinx v3.2.0
    /home/nilesh/ups/nitime/doc/conf.py:34: MatplotlibDeprecationWarning:
    The mpl_toolkits.axes_grid module was deprecated in Matplotlib 2.1 and will be
    removed two minor releases later. Use mpl_toolkits.axes_grid1 and
    mpl_toolkits.axisartist, which provide the same functionality instead.
      __import__(package, fromlist=parts)
    WARNING: while setting up extension ipython_console_highlighting: extension
    'ipython_console_highlighting' has no setup() function; is it really a Sphinx
    extension module?

    Exception occurred:
      File "/home/nilesh/ups/nitime/doc/sphinxext/only_directives.py", line 40, in setup
        app.add_directive('htmlonly', html_only_directive, True, (0, 0, 0))
    TypeError: add_directive() takes from 3 to 4 positional arguments but 5 were given
    The full traceback has been saved in /tmp/sphinx-err-z9obztzm.log, if you want to
    report the issue to the developers.
    Please also report this if it was a user error, so that a better error message can
    be provided next time. A bug report can be filed in the tracker at
    <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
This PR attempts to fix that, and also adds a Sphinx build to the Travis tests
so that this keeps working with future commits.
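For reference, a rough sketch of the kind of change involved (the directive
class and its body here are illustrative assumptions, not necessarily what this
PR does): modern Sphinx expects ``add_directive(name, directive_class)`` rather
than the old function-plus-options calling convention::

    from docutils import nodes
    from docutils.parsers.rst import Directive

    class HTMLOnlyDirective(Directive):
        """Simplified stand-in: just render the directive's content."""
        has_content = True

        def run(self):
            node = nodes.container()
            self.state.nested_parse(self.content, self.content_offset, node)
            return [node]

    def setup(app):
        # New-style registration: pass the Directive class, no extra
        # (content, arguments) positional options.
        app.add_directive('htmlonly', HTMLOnlyDirective)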
-
Need description of the data file 'fmri_timeseries.csv'
Under nitime/data/ there is an fmri_timeseries.csv file with 31 different areas. Can you give me more information about this file? For example, how was the data acquired, or where did it come from?
Thank you!
-
TimeArray math
With this PR, adding and subtracting values which aren't TimeArrays first converts them and gives them the unit of the time array. For example::
    In [1]: import nitime

    In [2]: nitime.TimeArray(1) + 1
    Out[2]: 2.0 s

    In [3]: nitime.TimeArray(1, time_unit='ms') + 1
    Out[3]: 2.0 ms

    In [4]: nitime.TimeArray(1, time_unit='ms') + 1 + nitime.TimeArray(1)
    Out[4]: 1002.0 ms

    In [5]: a = nitime.TimeArray(1)

    In [6]: a
    Out[6]: 1.0 s

    In [7]: a.convert_unit('ms')

    In [8]: a
    Out[8]: 1000.0 ms

    In [9]: a + 1
    Out[9]: 1001.0 ms
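A minimal sketch of the coercion rule these examples illustrate (an
illustration of the intended semantics, not nitime's actual implementation)::

    import nitime

    def coerce_operand(t, other):
        """Wrap a plain number in a TimeArray carrying `t`'s unit before adding."""
        if not isinstance(other, nitime.TimeArray):
            other = nitime.TimeArray(other, time_unit=t.time_unit)
        return other

    a = nitime.TimeArray(1, time_unit='ms')
    print(a + coerce_operand(a, 1))   # 2.0 ms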
-
`test_FilterAnalyzer` fails with scipy 1.8.0
Hi,
In maintaining the NixOS package for nitime, we noticed that the test
``test_FilterAnalyzer`` fails once we bump scipy to 1.8.0::

    _____________________________ test_FilterAnalyzer ______________________________

    def test_FilterAnalyzer():
        """Testing the FilterAnalyzer """
        t = np.arange(np.pi / 100, 10 * np.pi, np.pi / 100)
        fast = np.sin(50 * t) + 10
        slow = np.sin(10 * t) - 20

        fast_mean = np.mean(fast)
        slow_mean = np.mean(slow)

        fast_ts = ts.TimeSeries(data=fast, sampling_rate=np.pi)
        slow_ts = ts.TimeSeries(data=slow, sampling_rate=np.pi)

        # Make sure that the DC is preserved
        f_slow = nta.FilterAnalyzer(slow_ts, ub=0.6)
        f_fast = nta.FilterAnalyzer(fast_ts, lb=0.6)

        npt.assert_almost_equal(f_slow.filtered_fourier.data.mean(),
                                slow_mean, decimal=2)
        npt.assert_almost_equal(f_slow.filtered_boxcar.data.mean(),
                                slow_mean, decimal=2)
        npt.assert_almost_equal(f_slow.fir.data.mean(), slow_mean)
        npt.assert_almost_equal(f_slow.iir.data.mean(), slow_mean)

        npt.assert_almost_equal(f_fast.filtered_fourier.data.mean(), 10)
        npt.assert_almost_equal(f_fast.filtered_boxcar.data.mean(), 10, decimal=2)
        npt.assert_almost_equal(f_fast.fir.data.mean(), 10)
        npt.assert_almost_equal(f_fast.iir.data.mean(), 10)

        # Check that things work with a two-channel time-series:
        T2 = ts.TimeSeries(np.vstack([fast, slow]), sampling_rate=np.pi)
        f_both = nta.FilterAnalyzer(T2, ub=1.0, lb=0.1)

        # These are rather basic tests:
        npt.assert_equal(f_both.fir.shape, T2.shape)
    >   npt.assert_equal(f_both.iir.shape, T2.shape)
Full build log available at https://hydra.nixos.org/log/12f43cyblp08zbjc5psd8ayxxmq3if72-python3.9-nitime-0.9.drv where all the python dependency versions can be seen. This is on an x86_64 linux system.
-
negative values in confidence interval of multi-taper coherence
First of all, I still need to read the references more carefully, so I might be wrong.

In the multi-taper coherence estimation tutorial, the confidence intervals are
computed (``t975_limit`` and ``t025_limit``), but they are not printed or
visualized in any way later. It turns out that ``t025_limit`` contains many
negative values, even though coherence is constrained to lie within [0, 1].

Is anything going wrong here?
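One possible explanation (a guess, not a confirmed diagnosis) is that the
interval comes from a symmetric normal approximation around the coherence
estimate, which can dip below zero when the coherence is small. A hedged
workaround is to build the interval on a variance-stabilized (arctanh) scale
and map it back, which keeps both bounds inside [0, 1); the names ``coh`` and
``se_z`` below are illustrative assumptions, not variables from the tutorial::

    import numpy as np

    def fisher_z_ci(coh, se_z, z_crit=1.96):
        # coh: magnitude-squared coherence estimate(s) in [0, 1]
        # se_z: standard error of arctanh(sqrt(coh)) (assumed to be available)
        z = np.arctanh(np.sqrt(coh))
        lo = np.tanh(np.maximum(z - z_crit * se_z, 0)) ** 2
        hi = np.tanh(z + z_crit * se_z) ** 2
        return lo, hi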
-
tsa.periodogram() returns frequencies of all 0s when Fs=1.
Hi,
I noticed that the following code returns a ``freqs`` array that is all 0s::

    freqs, d_psd = tsa.periodogram(ar_seq, Fs=1., normalize=False)

I believe it is this line (in ``algorithms/spectral.py``) that is causing the issue::

    freqs = np.linspace(0, Fs // 2, Fn)

Should it be ``Fs / 2`` instead?

Version: nitime 0.9, installed via conda.
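A quick check of the arithmetic shows why this happens (a minimal sketch;
``Fn = 5`` is just an arbitrary number of frequency bins for illustration)::

    import numpy as np

    Fs = 1.0
    Fn = 5

    # With floor division, 1.0 // 2 == 0.0, so every frequency collapses to zero:
    print(np.linspace(0, Fs // 2, Fn))   # [0. 0. 0. 0. 0.]

    # True division gives the expected grid up to the Nyquist frequency:
    print(np.linspace(0, Fs / 2, Fn))    # [0.    0.125 0.25  0.375 0.5  ]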
-
Will it work for multivariate time-series prediction, both regression and classification?
Great code, thanks. Could you clarify whether it will work for multivariate
time-series prediction, both regression and classification:

1. where all values are continuous values, or
2. where the values are a mixture of continuous and categorical values, for
   example where 2 dimensions have continuous values and 3 dimensions are
   categorical?

For example::

       color   weight  gender  height  age
    1  black   56      m       160     34
    2  white   77      f       170     54
    3  yellow  87      m       167     43
    4  white   55      m       198     72
    5  white   88      f       176     32
-
nitime not installing in Jupyter
Hello,
I installed nitime system-wide from the command window, but was afterwards made aware that it has to be installed into the Jupyter environment, because otherwise it doesn't work there. However, when I try to install it from Jupyter (using ``!pip install nitime``), the kernel just remains busy and nothing happens (I gave it 3 hours). The weird thing is that I tried the same command to install another random package ("geocoder") and that immediately worked.
Does anyone know why nitime won't install?
Thanks in advance!
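One thing worth trying (a common Jupyter pattern, not a confirmed fix for this
particular hang) is to install into the exact interpreter the running kernel
uses, from a notebook cell::

    import sys
    # sys.executable is the kernel's own interpreter, which is often not the
    # Python that a bare `pip install` on the command line targets.
    !{sys.executable} -m pip install nitime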
-
feature request: multiple `p` values for `detect_lines`
I'd like to perform harmonic analysis with two different p-values on the same
signal, with all other parameters the same. It seems like a huge waste to call
``utils.detect_lines`` twice, since the FFT has to be done twice, etc. Is there
a workaround where I can save partial results? How easy would it be to support
passing multiple p-values? Thanks!
Releases
-
rel/0.9 (Dec 19, 2020)