PyAF is an Open Source Python library for Automatic Time Series Forecasting built on top of popular pydata modules.

Overview

PyAF (Python Automatic Forecasting)


PyAF is an Open Source Python library for Automatic Forecasting built on top of popular Python data science modules: NumPy, SciPy, Pandas and scikit-learn.

PyAF works as an automated process for predicting future values of a signal using a machine learning approach. It provides a set of features that is comparable to some popular commercial automatic forecasting products.

PyAF has been developed, tested and benchmarked using Python 3.x.

PyAF is distributed under the 3-Clause BSD license.

Demo

import numpy as np
import pandas as pd
import pyaf.ForecastEngine as autof

# generate a daily signal covering one year 2016 in a pandas dataframe
N = 360
df_train = pd.DataFrame({"Date": pd.date_range(start="2016-01-25", periods=N, freq='D'),
                         "Signal": (np.arange(N)//40 + np.arange(N) % 21 + np.random.randn(N))})

# create a forecast engine, the main object handling all the operations
lEngine = autof.cForecastEngine()

# get the best time series model for predicting one week
lEngine.train(iInputDS=df_train, iTime='Date', iSignal='Signal', iHorizon=7);
lEngine.getModelInfo() # => relative error 7% (MAPE)

# predict one week
df_forecast = lEngine.forecast(iInputDS=df_train, iHorizon=7)
# list the columns of the forecast dataset
print(df_forecast.columns)

# print the real forecasts
# Future dates : ['2017-01-19T00:00:00.000000000' '2017-01-20T00:00:00.000000000' '2017-01-21T00:00:00.000000000' '2017-01-22T00:00:00.000000000' '2017-01-23T00:00:00.000000000' '2017-01-24T00:00:00.000000000' '2017-01-25T00:00:00.000000000']
print(df_forecast['Date'].tail(7).values)

# signal forecast : [ 9.74934646  10.04419761  12.15136455  12.20369717  14.09607727 15.68086323  16.22296559]
print(df_forecast['Signal_Forecast'].tail(7).values)

This demo is also available as a Jupyter notebook.
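To visualize the result, the forecast data-frame can be plotted directly with pandas/matplotlib. A minimal sketch continuing the demo above; the prediction-interval column names are assumptions (by analogy with the '<signal>_Forecast_Lower_Bound'/'<signal>_Forecast_Upper_Bound' pattern seen further down this page) and should be checked against the print(df_forecast.columns) output:

import matplotlib.pyplot as plt

# continuing the demo: df_forecast holds the history plus the 7 forecast rows.
# The interval column names below are assumptions; verify them with df_forecast.columns.
lColumns = ['Signal', 'Signal_Forecast',
            'Signal_Forecast_Lower_Bound', 'Signal_Forecast_Upper_Bound']
df_forecast.tail(60).plot.line('Date', lColumns, grid=True, figsize=(12, 6))
plt.show()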

Features

PyAF forecasts the future values of a time series (or signal) in a fully automated way. To build forecasts, it uses time information (identifying long-term evolution and periodic patterns), analyzes the past of the signal, and can exploit exogenous data (user-provided time series that may be correlated with the signal) as well as the hierarchical structure of the signal (for example, by aggregating the forecasts of spatial components).

PyAF uses Pandas as a data access layer. It consumes data coming from a pandas data-frame (with time and signal columns), builds a time series model, and outputs the forecasts in a pandas data-frame. Pandas is an excellent data access layer: it reads/writes a huge set of file formats, accesses various data sources (databases) and has an extensive set of algorithms for handling data-frames (aggregation, statistics, linear algebra, plotting, etc.).
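For example, a typical round trip stays entirely in pandas; a minimal sketch (file name and column names are hypothetical):

import pandas as pd
import pyaf.ForecastEngine as autof

# hypothetical CSV file with 'Date' and 'Sales' columns
df = pd.read_csv("sales_history.csv", parse_dates=['Date'])

lEngine = autof.cForecastEngine()
lEngine.train(iInputDS=df, iTime='Date', iSignal='Sales', iHorizon=12)

# the forecast is again a pandas data-frame and can be written back to any supported format
df_forecast = lEngine.forecast(iInputDS=df, iHorizon=12)
df_forecast.to_csv("sales_forecast.csv", index=False)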

PyAF statistical time series models are built/estimated/trained using scikit-learn.

The following features are available:

  1. Training a model to forecast a time series (given in a pandas data-frame with time and signal columns).
    • PyAF uses a machine learning approach (the signal is cut into estimation and validation parts, respectively, 80% and 20% of the signal).
    • A time-series cross-validation can also be used.
  2. Forecasting a time series model on a given horizon (forecast result is also a pandas data-frame) and providing prediction/confidence intervals for the forecasts.
  3. Generic training features
    • Signal decomposition as the sum of a trend, periodic and AR components.
    • PyAF works as a competition between a comprehensive set of possible signal transformations and linear decompositions. For each transformed signal, a set of possible trends, periodic components and AR models is generated and all the possible combinations are estimated. The best decomposition in terms of performance is kept to forecast the signal (the performance is computed on a part of the signal that was not used for the estimation).
    • A signal transformation can be applied before the decomposition. Four transformations are supported by default; other transformations (Box-Cox, etc.) are also available.
    • All models are estimated using standard procedures and state-of-the-art time series modeling. For example, trend regressions and AR/ARX models are estimated using scikit-learn linear regression models.
    • Standard performance measures are used (L1, RMSE, MAPE, MedAE, LnQ, etc.)
  4. PyAF analyzes the time variable and infers the frequency from the data.
    • Natural time frequencies are supported: Minute, Hour, Day, Week and Month.
    • Strange frequencies like every 3.2 days or every 17 minutes are supported if data are recorded accordingly (every other Monday => two weeks frequency).
    • The frequency is computed as the mean duration between consecutive observations by default (as a pandas DateOffset).
    • The frequency is used to generate values for future dates automatically.
    • PyAF does its best when dates are not regularly observed. Time frequency is approximate in this case.
    • Real/Integer valued (fake) dates are also supported and handled in a similar way.
  5. Exogenous Data Support
    • Exogenous data can be provided to improve the forecasts. These are expected to be stored in an external data-frame (this data-frame will be merged with the training data-frame); a minimal sketch is given after this feature list.
    • Exogenous data are integrated into the modeling process through their past values (ARX models).
    • Exogenous variables can be of any type (numeric, string, date or object).
    • Exogenous variables are dummified for the non-numeric types, and standardized for the numeric types.
  6. PyAF implements Hierarchical Forecasting. It follows the excellent approach used in the Rob J Hyndman and George Athanasopoulos book. Thanks @robjhyndman
    • Hierarchies and grouped time series are supported.
    • Bottom-Up, Top-Down (using proportions), Middle-Out and Optimal Combinations are implemented.
  7. The modeling process is customizable and has a huge set of options. The default values of these options should however be OK to produce a reasonable quality model in a limited amount of time (a few minutes).
    • These options give access to a full set of signal transformations and AR-like models that are not enabled by default.
    • These include Logit and Fisher transformations, as well as XGBoost, Support Vector Regression, LightGBM and Croston intermittent demand models, among others.
    • By default, PyAF uses a fast mode that activates many popular models. It is also possible to activate a slow mode, in which PyAF explores all possible models.
    • Specific models and features can be customized.
  8. A benchmarking process is in place (using M1, M2, M3 competitions, NN3, NN5 forecasting competitions).
    • This process will be used to control the quality of modeling changes introduced in future versions of PyAF. A related GitHub issue tracks this.
    • Benchmarks data/reports are saved in a separate github repository.
    • Sample benchmark report with 1001 datasets from the M1 Forecasting Competition.
  9. Basic plotting functions using matplotlib with standard time series and forecasts plots.
  10. Software Quality Highlights
    • An object-oriented approach is used for the system design. Separation of concerns is the key factor here.
    • Fully written in Python with NumPy, SciPy, Pandas and scikit-learn objects. Tries to be column-based everywhere for performance reasons (respecting some modeling time and memory constraints).
    • Internally using a fit/predict pattern, inspired by scikit-learn, to estimate/forecast the different signal components (trends, cycles and AR models).
    • A test-driven approach (TDD) is used. Test scripts are available in the tests directory, one directory for each feature.
    • TDD implies that even the most recent features have some sample scripts in this directory. Want to know how to use cross-validation with PyAF? Here are some scripts.
    • Some jupyter notebooks are available for demo purposes with standard time series and forecasts plots.
    • Very simple API for training and forecasting.
  11. A basic RESTful Web Service (Flask) is available.
    • This service allows building a time series model, forecasting future values and producing some standard plots, given a minimal specification of the signal in the JSON request body (at least a link to a CSV file containing the data).
    • See this doc and the related github issue for more details.
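As an illustration of the exogenous data support (feature 5 above), the same train call accepts an iExogenousData argument. The exact structure expected is described in the exogenous-data notebook referenced in the Documentation section; the sketch below assumes a (data-frame, list of exogenous columns) pair with hypothetical column names:

import numpy as np
import pandas as pd
import pyaf.ForecastEngine as autof

N = 360
# the exogenous data-frame also covers the forecast horizon (7 extra days)
dates = pd.date_range(start="2016-01-25", periods=N + 7, freq='D')

df_train = pd.DataFrame({"Date": dates[:N],
                         "Signal": np.arange(N) % 21 + np.random.randn(N)})
df_exog = pd.DataFrame({"Date": dates,
                        "Promo": np.random.choice(["none", "discount"], N + 7),   # non-numeric => dummified
                        "Temperature": 15 + 10 * np.random.randn(N + 7)})         # numeric => standardized

lEngine = autof.cForecastEngine()
# ASSUMPTION: iExogenousData is passed as a (data-frame, exogenous variable list) pair
lExogenousData = (df_exog, ["Promo", "Temperature"])
lEngine.train(iInputDS=df_train, iTime='Date', iSignal='Signal', iHorizon=7,
              iExogenousData=lExogenousData)
df_forecast = lEngine.forecast(iInputDS=df_train, iHorizon=7)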

PyAF is a work in progress. The set of features is evolving. Your feature requests, comments, help, hints are very welcome.

Installation

PyAF has been developed, tested and used with Python 3.x.

It can be installed from PyPI for the latest official release:

pip install pyaf

The development version is also available by executing:

pip install scipy pandas scikit-learn matplotlib pydot dill sqlalchemy xgboost
pip install --upgrade git+git://github.com/antoinecarme/pyaf.git

Development

Code contributions are welcome, as are bug reports, requests for new features, documentation and tests. Please use the GitHub platform for these tasks.

You can check out the latest sources of PyAF from GitHub with the command:

git clone http://github.com/antoinecarme/pyaf.git

Project history

This project was started in summer 2016 as a POC to check the feasibility of an automatic forecasting tool based only on freely available Python data science software (NumPy, SciPy, Pandas, scikit-learn, etc.).

See the AUTHORS.rst file for a complete list of contributors.

Help and Support

PyAF is currently maintained by the original developer. Support is provided when possible; even if you are not creating an issue, you are encouraged to follow these guidelines.

Bug reports, improvement requests, documentation, hints and test scripts are welcome. Please use the GitHub platform for these tasks.

Please don't ask too much about new features. PyAF is only about forecasting (the last F). To keep PyAF's design simple and flexible, we avoid feature creep.

For your commercial forecasting projects, please consider using the services of a forecasting expert near you (be it an R or a Python expert).

Documentation

An introductory notebook on time series forecasting with PyAF is available here. It contains some real-world examples and use cases.

A specific notebook describing the use of exogenous data is available here.

Notebooks describing an example of hierarchical forecasting models are available for Signal Hierarchies and for Grouped Signals.

The python code is not yet fully documented. This is a top priority (TODO).

Communication

Comments, appreciation, remarks, etc. are welcome. Your feedback is welcome if you use this library in a project or a publication.

Comments
  • ModuleNotFoundError: No module named 'multiprocess'

    Hi there! I'm doing some forecasting based on this package but I'm having problems and difficulties making it run. Calling: lEngine.train(iInputDS = df, iTime = "datum", iSignal = "krank_pct", iHorizon = 12); where datum is a datetime column ("YYYY-MM-DD"), krank_pct is numeric, and iHorizon = 12 because I have monthly data.

    The call throws an error: ModuleNotFoundError: No module named 'multiprocess'. I just don't know where the mistake lies. Can someone help me to make it run?

    Thanks a lot in advance! M

    opened by maki-markie 48
  • Zero division error - cannot figure out source.

    Hello,

    I have some sparse hierarchical data that I am running through pyaf. None of my individual timeseries are entirely 0, yet I'm getting a divide by zero error when trying to run lEngine.train on my dataset.

    ---------------------------------------------------------------------------
    ZeroDivisionError                         Traceback (most recent call last)
    ~/anaconda3/envs/tf-gpu/lib/python3.6/site-packages/pyaf/HierarchicalForecastEngine.py in train(self, iInputDS, iTime, iSignal, iHorizon, iHierarchy, iExogenousData)
         22         try:
    ---> 23             self.train_HierarchicalModel(iInputDS, iTime, iSignal, iHorizon, iHierarchy, iExogenousData);
         24         except tsutil.PyAF_Error as error:
    
    ~/anaconda3/envs/tf-gpu/lib/python3.6/site-packages/pyaf/HierarchicalForecastEngine.py in train_HierarchicalModel(self, iInputDS, iTime, iSignal, iHorizon, iHierarchy, iExogenousData)
         93         self.mSignalHierarchy = lSignalHierarchy;
    ---> 94         self.mSignalHierarchy.fit();
         95 
    
    ~/anaconda3/envs/tf-gpu/lib/python3.6/site-packages/pyaf/TS/SignalHierarchy.py in fit(self)
        186         self.create_all_levels_models(lAllLevelsDataset, self.mHorizon, self.mDateColumn);
    --> 187         self.computeTopDownHistoricalProportions(lAllLevelsDataset);
        188         lForecast_DF = self.internal_forecast(self.mTrainingDataset , self.mHorizon)
    
    ~/anaconda3/envs/tf-gpu/lib/python3.6/site-packages/pyaf/TS/SignalHierarchy.py in computeTopDownHistoricalProportions(self, iAllLevelsDataset)
        273                         self.mAvgHistProp[col][col1] = (lEstim[col1] / lEstim[col]).mean();
    --> 274                         self.mPropHistAvg[col][col1] = lEstim[col1].mean() / lEstim[col].mean();
        275         # print("AvgHitProp\n", self.mAvgHistProp);
    
    ZeroDivisionError: float division by zero
    

    Any ideas as to what this might be or how to debug the issue in my dataset or the code itself?

    opened by andmib 24
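    For context, the proportions computed in the traceback above are ratios of means over the estimation part of each series, so a sparse series that happens to be all zeros on that slice (even if it is not entirely zero) produces a zero denominator. A minimal illustration with a hypothetical estimation slice:

    import pandas as pd

    # hypothetical estimation slice of a sparse hierarchy: the parent aggregate
    # is all zeros on this slice, although the full series is not entirely zero
    lEstim = pd.DataFrame({"Parent":  [0.0, 0.0, 0.0, 0.0],
                           "Child_A": [0.0, 0.0, 0.0, 0.0]})

    # computeTopDownHistoricalProportions (line 274 in the traceback) divides
    # lEstim["Child_A"].mean() by lEstim["Parent"].mean(); a zero parent mean
    # makes the proportion undefined, which surfaces as the division error
    print(lEstim["Child_A"].mean(), lEstim["Parent"].mean())  # 0.0 0.0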
  • Projections Wrongfully Linear

    I am training a basic model that is comparing weight lifted vs. time.

    As you will notice, the timeline is pretty limited, but this will likely be the case in most of my uses. The visual (shown below) is linear, which is obviously incorrect.

    I am not too advanced in Python or forecasting, but visually, something looks wrong. Here is my full code, which includes data:

    import pandas as pd
    import ssl
    import datetime
    import matplotlib.pyplot as plt
    import pyaf.ForecastEngine as autof
    import numpy
    
    temp_data = [
    	
    	
    	{
    		"weight" : 185.0,
    		"date" : "2021-11-19"
    	},
    	{
    		"weight" : 165.0,
    		"date" : "2021-11-22"
    	},
    	{
    		"weight" : 145.0,
    		"date" : "2021-11-28"
    	},
    	{
    		"weight" : 175.0,
    		"date" : "2021-12-01"
    	},
    	
    	{
    		"weight" : 145.0,
    		"date" : "2021-12-08"
    	},
    	{
    		"weight" : 150.0,
    		"date" : "2021-12-12"
    	},
    	{
    		"weight" : 190.0,
    		"date" : "2021-12-18"
    	},
    	{
    		"weight" : 200.0,
    		"date" : "2021-12-24"
    	},
    	{
    		"weight" : 180.0,
    		"date" : "2021-12-27"
    	},
    	{
    		"weight" : 175.0,
    		"date" : "2022-01-01"
    	},
    	{
    		"weight" : 160.0,
    		"date" : "2022-01-05"
    	},
    ]
    
    #data = numpy.toarray(temp_data)
    
    if __name__ == '__main__':
    	weight_dataframe = pd.DataFrame(temp_data)
    	print(weight_dataframe)
    	weight_dataframe['date'] = weight_dataframe['date'].apply(lambda x : datetime.datetime.strptime(x, "%Y-%m-%d"))
    	weight_dataframe.head()
    
    	lEngine = autof.cForecastEngine();
    	lEngine.train(weight_dataframe , 'date' , 'weight', 50);
    	weight_forecast_dataframe = lEngine.forecast(weight_dataframe, 50);
    	lEngine.getModelInfo() # => relative error 7% (MAPE)
    
    	#print(weight_forecast_dataframe)
    	weight_forecast_dataframe.plot.line('date', ['weight', 'weight_Forecast_Upper_Bound', 'weight_Forecast_Quantile_50', 'weight_Forecast_Lower_Bound'], grid = True, figsize=(12, 8), marker = 'o', color = ['#A1A5FF', 'green', 'blue', 'red'], title = 'Bench Press Projections');
    	plt.legend(['Previous Weight', 'Max Projected Weight', 'Median Projected Weight', 'Min Projected Weight'])
    	plt.ylabel('Weight')
    	plt.xlabel('Date')
    	plt.show()
    

    Here is a visual output:

    [screenshot: forecast plot showing the linear projection]

    Here is my system info as requested:

    /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/_distutils_hack/init.py:33: UserWarning: Setuptools is replacing distutils. PYAF_SYSTEM_DEPENDENT_VERSION_INFO (abridged): python 3.11.0 (CPython), pyaf 4.0, numpy 1.23.5, pandas 1.5.2, scipy 1.9.3, sklearn 1.1.3, matplotlib 3.6.2, xgboost 1.7.1, pydot 1.4.2, dill 0.3.6, sqlalchemy 1.4.44, setuptools 65.5.0, pip 22.3; Cython/keras/lightgbm/pathos/skorch/torch NOT_INSTALLED; platform macOS-12.5-arm64 (Darwin 21.6.0). The output also lists the PYAF_SYSTEM_DEPENDENT_ENVIRONMENT_VARIABLE entries (shell, VS Code and locale settings), omitted here.

    topic:user_support 
    opened by bjakobson 22
  • Investigate PyTorch-based LSTM and MLP models

    PyAF uses Google Keras/Tensorflow to implement LSTM and MLP models.

    It is interesting to be able to use Facebook PyTorch when available. PyTorch is more widely available, and more "open-source".

    https://pytorch.org/

    No GPU/TPU support is needed. PyAF does not need that much computing power.

    Impact: Keras/Tensorflow will be used only when PyTorch is not available. This impacts only LSTM and MLP models, which are not enabled by default.

    Easy to fix.

    Target Release : 2022-07-14

    class:enhancement priority:normal topic:modeling_quality topic:neural_net status:in_progress 
    opened by antoinecarme 12
  • Add LnQ Performance measure

    According to:

    https://en.wikipedia.org/wiki/Symmetric_mean_absolute_percentage_error

    A limitation of SMAPE is that if the actual or forecast value is 0, the error value jumps to its upper limit (200% for the first formula and 100% for the second formula).

    Provided the data are strictly positive, a better measure of relative accuracy can be obtained based on the log of the accuracy ratio: log(Ft / At) This measure is easier to analyse statistically, and has valuable symmetry and unbiasedness properties. When used in constructing forecasting models the resulting prediction corresponds to the geometric mean (Tofallis, 2015).

    class:enhancement priority:normal topic:modeling_quality 
    opened by antoinecarme 12
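    A minimal numpy sketch of the log accuracy ratio quoted above; the exact aggregation used by PyAF's LnQ measure may differ, this only computes the quantity from the quoted definition:

    import numpy as np

    # strictly positive actuals and forecasts, as required by the measure
    actual = np.array([100.0, 120.0, 80.0, 95.0])
    forecast = np.array([110.0, 115.0, 90.0, 100.0])

    # log accuracy ratio log(Ft / At); swapping forecast and actual only flips the sign
    log_ratio = np.log(forecast / actual)

    # one common aggregation is the sum (or mean) of squared log ratios
    lnq = np.sum(log_ratio ** 2)
    print(log_ratio, lnq)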
  • PyAF Powerpc support (IBM S822xx)

    Someone asked for PyAF powerpc support (IBM S822xx). This can be seen as a software robustness/portability test for PyAF.

    Need to run all PyAF tests on a Debian Linux PPC64 on at least IBM Power8 CPU.

    1. Check that all needed data packages are available (point missing packages).
    2. Run build tests (make -f tests/Makefile build-tests), the travis-ci equivalent
    3. Run extended tests (make -f tests/Makefile all)
    4. Analyze numerical differences if any.
    5. Big Endian / Little Endian ?

    The hardware is made available through the excellent Minicloud, The FREE OpenPower Cloud by Unicamp, Campinas, Sao Paulo, Brazil.

    https://openpower.ic.unicamp.br

    Big Thank you, @Unicamp-OpenPower !!!

    priority:high topic:generic class:devops 
    opened by antoinecarme 11
  • Add temporal hierarchical forecasting

    opened by antoinecarme 11
  • Add the possibility to use cross validation when training PyAF models

    Following the investigation performed in #53, implement a form of cross validation for PyAF models.

    Specifications:

    1. Cut the dataset into many folds according to a scikit-learn time series split: http://scikit-learn.org/stable/modules/cross_validation.html#cross-validation. The number of folds is a user option (default = 10).

    2. To have enough data, use only the last n/2 folds for estimating the models (thanks to the forecast R package ;). The default splits look like this (estimation folds => validation fold): [5] => [6], [5 6] => [7], [5 6 7] => [8], [5 6 7 8] => [9], [5 6 7 8 9] => [10].

    3. Use the model decomposition type or formula as a hyperparameter and optimize it. Select the decomposition(s) with the lowest mean MAPE over the validation datasets of all the possible splits.

    4. Among all the chosen decompositions, select the model with the lowest complexity (~ number of inputs).

    5. Execute the procedure on the ozone and air passengers datasets and compare with the non-cross-validation models (=> 2 Jupyter notebooks).

    class:enhancement priority:high topic:modeling_quality 
    opened by antoinecarme 10
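    The split structure in point 2 of the specification can be previewed with the scikit-learn splitter the issue refers to; a minimal sketch (PyAF's fold filtering is only indicated in the comment):

    import numpy as np
    from sklearn.model_selection import TimeSeriesSplit

    signal = np.arange(100)                 # toy signal
    tscv = TimeSeriesSplit(n_splits=10)     # default number of folds in the specification

    for lFold, (train_index, valid_index) in enumerate(tscv.split(signal)):
        # the specification keeps only the last n/2 splits, i.e. those whose
        # estimation part is long enough
        print(lFold, train_index[[0, -1]], valid_index[[0, -1]])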
  • Dataset containing high precision (nanoseconds) dates fails to train

    I was testing on some data and kept getting exceptions saying training failed; after looking around, I realized it was because of the high-precision dates.

    
    INFO:pyaf.std:START_TRAINING 'value'
    Traceback (most recent call last):
      File "C:\PYTHON3\lib\site-packages\pyaf\ForecastEngine.py", line 25, in train
        self.mSignalDecomposition.train(iInputDS, iTime, iSignal, iHorizon, iExogenousData);
      File "C:\PYTHON3\lib\site-packages\pyaf\TS\SignalDecomposition.py", line 631, in train
        self.checkData(iInputDS, iTime, iSignal, iHorizon, iExogenousData);
      File "C:\PYTHON3\lib\site-packages\pyaf\TS\SignalDecomposition.py", line 604, in checkData
        type1 = np.dtype(iInputDS[iTime])
    TypeError: Cannot interpret '0     2021-08-01 00:14:36.879515613+00:00
    1     2021-08-01 00:13:22.755664335+00:00
    2     2021-08-01 00:12:08.483382948+00:00
    3     2021-08-01 00:10:54.242433585+00:00
    4     2021-08-01 00:09:40.135882425+00:00
                          ...                
    115   2021-07-31 21:51:43.580248426+00:00
    116   2021-07-31 21:50:29.020741582+00:00
    117   2021-07-31 21:49:15.175994058+00:00
    118   2021-07-31 21:48:00.528170592+00:00
    119   2021-07-31 21:46:46.214305238+00:00
    Name: date, Length: 120, dtype: datetime64[ns, UTC]' as a data type
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\PYTHON3\lib\site-packages\pyaf\ForecastEngine.py", line 30, in train
        raise tsutil.PyAF_Error("TRAIN_FAILED");
    pyaf.TS.Utils.PyAF_Error: TRAIN_FAILED
    

    I changed the precision by casting my dates to second precision, and then training worked fine: df['date'] = df['date'].values.astype('<M8[s]'). It seems the underlying problem is in some numpy function, not too sure.

    opened by artrune 9
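    A short sketch of the workaround described above, truncating the timestamps to second precision before training (the data-frame and column names mirror the issue):

    import pandas as pd

    # toy data with timezone-aware, nanosecond-precision timestamps,
    # similar to the series shown in the traceback above
    df = pd.DataFrame({"date": pd.date_range("2021-08-01", periods=120, freq="min", tz="UTC"),
                       "value": range(120)})

    # workaround from the comment above: cast the timestamps to second
    # precision (dropping the timezone) before calling lEngine.train()
    df['date'] = df['date'].values.astype('<M8[s]')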
  • Crash when training using example code - An attempt has been made to start a new process before the current process has finished its bootstrapping phase

    I have pyaf installed and am attempting to run the example code over at https://pypi.org/project/pyaf/

    Here's the code:

    import numpy as np
    import pandas as pd
    import pyaf.ForecastEngine as autof
    
    N = 360
    df_train = pd.DataFrame({"Date" : pd.date_range(start="2016-01-25", periods=N, freq='D'), "Signal" : (np.arange(N)//40 + np.arange(N) % 21 + np.random.randn(N))})
    lEngine = autof.cForecastEngine()
    lEngine.train(iInputDS = df_train, iTime = 'Date', iSignal = 'Signal', iHorizon = 7)
    
    

    It fails on the last line with this error trace, multiple times (repeatedly, until I stop it):

    INFO:pyaf.std:START_TRAINING 'Signal'
    Traceback (most recent call last):
      File "pathtomyfolder\.venv\lib\site-packages\pyaf\ForecastEngine.py", line 25, in train
        self.mSignalDecomposition.train(iInputDS, iTime, iSignal, iHorizon, iExogenousData);
      File "pathtomyfolder\.venv\lib\site-packages\pyaf\TS\SignalDecomposition.py", line 641, in train
        lTrainer.train(iInputDS, iTime, iSignal, iHorizon)
      File "pathtomyfolder\.venv\lib\site-packages\pyaf\TS\SignalDecomposition.py", line 322, in train
        self.train_multiprocessed(iInputDS, iTime, iSignal, iHorizon);
      File "pathtomyfolder\.venv\lib\site-packages\pyaf\TS\SignalDecomposition.py", line 355, in train_multiprocessed
        pool = Pool(self.mOptions.mNbCores)
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\context.py", line 119, in Pool
        context=self.get_context())
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\pool.py", line 176, in __init__
        self._repopulate_pool()
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\pool.py", line 241, in _repopulate_pool
        w.start()
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\process.py", line 112, in start
        self._popen = self._Popen(self)
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\context.py", line 322, in _Popen
        return Popen(process_obj)
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\popen_spawn_win32.py", line 46, in __init__
        prep_data = spawn.get_preparation_data(process_obj._name)
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
        _check_not_importing_main()
      File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
        is not going to be frozen to produce an executable.''')
    RuntimeError:
            An attempt has been made to start a new process before the
            current process has finished its bootstrapping phase.

            This probably means that you are not using fork to start your
            child processes and you have forgotten to use the proper idiom
            in the main module:

                if __name__ == '__main__':
                    freeze_support()
                    ...

            The "freeze_support()" line can be omitted if the program
            is not going to be frozen to produce an executable.
    
    class:bug priority:high topic:generic status:in_progress 
    opened by sachingooo 9
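    A common workaround on Windows, where multiprocessing starts workers with spawn, is the one suggested by the error message itself: guard the training call with the __main__ check. A minimal sketch based on the example code above:

    import numpy as np
    import pandas as pd
    import pyaf.ForecastEngine as autof

    def main():
        N = 360
        df_train = pd.DataFrame({"Date": pd.date_range(start="2016-01-25", periods=N, freq='D'),
                                 "Signal": (np.arange(N)//40 + np.arange(N) % 21 + np.random.randn(N))})
        lEngine = autof.cForecastEngine()
        # the training call now only runs in the main process, not in the spawned workers
        lEngine.train(iInputDS=df_train, iTime='Date', iSignal='Signal', iHorizon=7)

    if __name__ == '__main__':
        main()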
  • Add a Pypi version

    While issue reports are always welcome, and you are free to use any form to submit them, the following points should be considered for easier processing and more productivity:

    1. The issue must be a bug or a feature request.
    2. A description is needed, as source code and/or a link to a dataset for which the problem arises (please simplify the code, anonymize the dataset, etc.).
    3. Information on the different software versions used (pyaf, numpy, pandas, scikit-learn, etc.). The output of the following script should be enough: https://github.com/antoinecarme/pyaf/blob/master/tests/basic_checks/platform_info.py
    opened by firmai 9
  • Automate Prototyping Activities - R-based Models

    It is useful to have a git branch which contains all the necessary tooling for prototyping.

    Make it possible to use R/forecast from inside pyaf. "Fake" pyaf models which call R to validate a specific implementation.

    This branch is not to be merged.

    First application: Threshold AR models #214 and TSMARS models #215

    priority:high topic:modeling_quality class:devops status:in_progress topic:Green 
    opened by antoinecarme 6
  • Investigate TSMARS Models

    TSMARS is an application of MARS (multivariate adaptive regression spline) regression models to time series forecasting.

    https://en.wikipedia.org/wiki/Multivariate_adaptive_regression_spline

    opened by antoinecarme 7
  • Investigate Threshold AR Models

    Introduce some non-linear time series models. Two-regime threshold AR (TAR) models are good candidates.

    Original paper:

    Howell Tong, Department of Statistics, The Chinese University of Hong Kong, Shatin, NT, Hong Kong

    Tong, H. (1983) Threshold Models in Nonlinear Time Series Analysis. Lecture Notes in Statistics, Springer-Verlag

    https://link.springer.com/book/10.1007/978-1-4684-7888-4 https://link.springer.com/chapter/10.1007/978-1-4684-7888-4_3

    A good reference is given by:


    Nonlinear Time Series Analysis

    Author(s): Ruey S. Tsay, Rong Chen

    https://onlinelibrary.wiley.com/doi/book/10.1002/9781119514312

    opened by antoinecarme 7
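    For context, a two-regime threshold AR process switches its autoregressive coefficient depending on the previous value; a minimal simulation sketch (threshold and coefficients are arbitrary, not taken from the references):

    import numpy as np

    rng = np.random.default_rng(1789)
    n, threshold = 300, 0.0
    phi_low, phi_high = 0.7, -0.4   # one AR(1) coefficient per regime

    x = np.zeros(n)
    for t in range(1, n):
        # the regime (and hence the coefficient) depends on the previous value
        phi = phi_low if x[t - 1] <= threshold else phi_high
        x[t] = phi * x[t - 1] + rng.normal(scale=0.5)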
  • Large Horizon Models

    Large Horizon Models (H large enough). Profiling for CPU/memory/speed.

    Compute prediction intervals for all tested models. Use a more sophisticated forecast performance combination in model selection (mean? max?), or decreasing time-based weights? Take into account the shape of the prediction interval (esthetics for model precision).

    opened by antoinecarme 1
  • Investigate Model Esthetics for PyAF

    SOTA: investigate the existing attempts at defining the most expected features of a "nice" model.

    Esthetics != explainability but the intersection is not empty.

    Esthetics != Simplicity but the intersection is not empty.

    Exercise: take two very close models and try to make a "pitch" for each one (mine is better ;).

    Not sure this will lead to some product/value. A nice-to-have.

    opened by antoinecarme 1
  • Use PyTorch as the reference deep learning architecture for future projects

    PyAF will use PyTorch as its deep learning architecture for future projects. A few reasons for this:

    1. PyTorch is fully open source. Green (#176).
    2. PyTorch's internal/technical choices are very sane. It works even in very harsh environments: the SPARC64 architecture.
    3. SPARC64 architecture: abandoned years ago, no commercial support, very strong technically (manycore, > 128 threads), with approximate OS support (Debian rocks here ;); was able to build a set of packages for PyTorch from scratch: https://github.com/antoinecarme/sparc-t3-data/tree/master/debian-sparc64/packages
    4. PyAF runs OK with PyTorch on SPARC64 and uses all the 128 threads for some complex hierarchical forecasting models.
    class:enhancement priority:high topic:modeling_quality topic:neural_net status:in_progress 
    opened by antoinecarme 1
Releases (4.0)
  • 4.0(Jul 13, 2022)

    RELEASE 4.0 (2022-07-14)

    1. Python 3.10 support #186
    2. Add Multiplicative Models/Seasonals #178
    3. Speed Performance Improvements: #190, #191
    4. Exogenous data support improvements : #193, #197, #198
    5. PyAF support for ARM64 Architecture #187
    6. PyTorch support : #199
    7. Improved Logging : #185
    8. Bug Fixes : #156, #179, #182, #184
    9. Release Process : Pre-release Benchmarks #194
    10. Release Process : Profiling and Warning Hunts #195
    11. Release Process : Review Existing Docs #196, #35
  • 3.0(Jul 13, 2021)

    RELEASE 3.0 (2021-07-14)

    1. Python 3.9 support #149
    2. Probabilistic Forecasting : Forecast quantiles (#140), CRPS (#74), Plots and Docs (#158).
    3. Add LightGBM based models #143
    4. Add more Performance Measures : MedAE (#144) , LnQ ( #43 )
    5. PyAF Powerpc support (IBM S822xx) #160
    6. More Parallelization Efforts (#145)
    7. Add Missing Data Imputation Methods (#146 )
    8. Improved long signals modeling (#167)
    9. Warning Hunts (#153)
    10. Some Bug Fixes (#163, #142, #168).
    11. Switched to Circle-CI (#164)
    12. Plot Functions Improvement #169
    13. Model Complexity Improvement (#171)
    14. Documentation review/corrections (#174)
  • 2.0(Jul 14, 2020)

    RELEASE 2.0 (2020-07-14)

    1. The time column was normalized frequently, leading to a performance issue. Profiling; significant speedup. Issue #121
    2. Corrected PyPi packaging. Issue #123
    3. Allow using exogenous data in hierarchical forecasting models. Issue #124
    4. Properly handle very large signals. Add Sampling. Issue #126
    5. Add temporal hierarchical forecasting. Issue #127
    6. Analyze Business Seasonals (HourOfWeek and derivatives). Issue #131
    7. Improved logs (More model details). Issue #133, #134, #135
    8. More robust cycles (use target median instead of target mean encoding). Issue #132
    9. Analyze Business Seasonals (WeekOfMonth and derivatives). Issue #137
    10. Improved JSON output (added Model Options). Issue #136
    11. Improved CPU usage (parallelization) for hierarchical models. Issue #115
    12. Speedups in multiple places : forecasts generation, plotting, AR Modelling (feature selection).
  • 1.2.4(Apr 5, 2020)

  • 1.2.3(Apr 5, 2020)

  • 1.2.2(Apr 4, 2020)

    PyAF now has a PyPI installer. You can now use:

    pip install pyaf

    to install it.

    Additional tweaks ... double-check PyPI / twine / demo scripts.

  • 1.1 (Jun 30, 2019)

  • 1.0-RC1(Apr 27, 2017)

  • v0.3-alpha(Dec 26, 2016)

  • v0.2-alpha(Dec 8, 2016)

  • v0.1-alpha(Nov 23, 2016)

Owner
CARME Antoine