Gaussian Process Optimization using GPy

Overview

End of maintenance for GPyOpt

Dear GPyOpt community!

We would like to acknowledge the obvious. The core team of GPyOpt has moved on, and over the past months we haven't been giving the package nearly as much attention as it deserves. Instead of dragging our feet and giving people only occasional replies and no new features, we feel the time has come to officially declare the end of GPyOpt maintenance.

We would like to thank the community that has formed around GPyOpt. Without your interest, discussions, bug fixes and pull requests the package would never be as successful as it is. We hope we were able to provide you with a useful tool to aid your research and work.

From now on we won't be participating in the issues, merging PRs or developing any new functionality. All existing PRs will be closed (but not the issues). The repo itself is not going anywhere, so feel free to start new discussion threads and forks. We are also still around, so we may drop an occasional comment here or there. But no promises.

Finally, if you feel really enthusiastic and would like to take over the package, feel free to drop both of us an email, and who knows, maybe you'll be the one(s) carrying GPyOpt to new heights!

Sincerely yours, Andrei Paleyes and Javier Gonzalez

GPyOpt

Gaussian process optimization using GPy. Performs global optimization with different acquisition functions. Among other functionalities, it is possible to use GPyOpt to optimize physical experiments (sequentially or in batches) and tune the parameters of Machine Learning algorithms. It is able to handle large data sets via sparse Gaussian process models.
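
A minimal usage sketch (the toy 1-D objective below is purely illustrative; GPyOpt calls the objective with a 2-D numpy array of candidate points and expects one value per row):

import numpy as np
import GPyOpt

# Toy objective to minimise; x arrives as an (n, 1) array of candidate points.
def f(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

domain = [{'name': 'x', 'type': 'continuous', 'domain': (0, 1)}]

opt = GPyOpt.methods.BayesianOptimization(f=f, domain=domain, acquisition_type='EI')
opt.run_optimization(max_iter=15)
print(opt.x_opt, opt.fx_opt)

In low dimensions, opt.plot_acquisition() and opt.plot_convergence() give a quick visual check of what the optimizer is doing.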


Citation

@Misc{gpyopt2016,
author = {The GPyOpt authors},
title = {{GPyOpt}: A Bayesian Optimization framework in python},
howpublished = {\url{http://github.com/SheffieldML/GPyOpt}},
year = {2016}
}

Getting started

Installing with pip

The simplest way to install GPyOpt is with pip. Ubuntu users can do:

sudo apt-get install python-pip
pip install gpyopt

If you'd like to install from source, or want to contribute to the project (e.g. by sending pull requests via GitHub), read on. Clone the repository from GitHub and install it in development mode so the checkout ends up on your $PYTHONPATH:

git clone https://github.com/SheffieldML/GPyOpt.git
cd GPyOpt
python setup.py develop

Dependencies:

  • GPy
  • paramz
  • numpy
  • scipy
  • matplotlib
  • DIRECT (optional)
  • cma (optional)
  • pyDOE (optional)
  • sobol_seq (optional)

You can install dependencies by running:

pip install -r requirements.txt
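
A quick sanity check that everything is importable (a minimal sketch; it does nothing beyond importing the packages):

import GPy
import GPyOpt

print("GPy and GPyOpt imported successfully")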

Funding Acknowledgements

  • BBSRC Project No BB/K011197/1 "Linking recombinant gene sequence to protein product manufacturability using CHO cell genomic resources"

  • See GPy funding Acknowledgements

Comments
  • pre-computed search space

    I forked this project and made some modifications to support a pre-computed search space for the case of discrete variables with constraints. What is the process for getting the green light for these changes to be merged to master (i.e. the pull request approval process)?

    opened by pavel-rev 16
  • different results vs. runs

    I run the optimization and get different results from run to run. Sometimes it hits the optimum (I have an independent "slow" exhaustive search, so I know the optimum). Sometimes it does not. Are there known rules of thumb with respect to the parameters to get more consistent results?
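
    A sketch of the usual knobs for more repeatable runs (objective and domain are placeholders; as far as I can tell the random initial design draws from numpy's global RNG, so seeding it pins down the run):

    import numpy as np
    import GPyOpt

    np.random.seed(0)  # fixes the random initial design (and other numpy-based sampling)

    opt = GPyOpt.methods.BayesianOptimization(
        f=objective,                # placeholder: your objective
        domain=domain,              # placeholder: your domain definition
        initial_design_numdata=20,  # a larger initial design reduces run-to-run variance
        exact_feval=True,           # only if the objective is noise-free
        acquisition_type='EI')
    opt.run_optimization(max_iter=50)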

    opened by pavel-rev 13
  • No module named task.cost

    Hi everyone! I still have a problem using the new version of GPyOpt, with the following error:

    import GPyOpt
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "build/bdist.macosx-10.5-x86_64/egg/GPyOpt/__init__.py", line 4, in <module>
      File "build/bdist.macosx-10.5-x86_64/egg/GPyOpt/core/__init__.py", line 4, in <module>
      File "build/bdist.macosx-10.5-x86_64/egg/GPyOpt/core/bo.py", line 7, in <module>
    ImportError: No module named task.cost

    Any idea how to resolve this problem? Thanks!

    opened by calm85 13
  • ImportError: No module named 'core'

    I tried to run the first example from the manual: http://nbviewer.ipython.org/github/SheffieldML/GPyOpt/blob/master/manual/GPyOpt_reference_manual.ipynb

    The line:

    import GPyOpt
    

    fails with the following error:

    ImportError: No module named 'core'
    

    Happens on Python 3.4 / Windows 8 x64.

    I think the problem is Python 3.x, but I cannot use another version because of other packages... any way to fix this?

    opened by stmax82 12
  • pip install issue

    ± pip install gpy gpyopt --user --upgrade
    Collecting gpy
      Downloading GPy-1.8.5.tar.gz (856kB)
        100% |████████████████████████████████| 860kB 1.3MB/s 
    Collecting gpyopt
      Using cached GPyOpt-1.2.1.tar.gz
        Complete output from command python setup.py egg_info:
        Traceback (most recent call last):
          File "<string>", line 1, in <module>
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/setup.py", line 6, in <module>
            from GPyOpt.__version__ import __version__
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/__init__.py", line 7, in <module>
            from GPyOpt.core.task.space import Design_space
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/core/__init__.py", line 4, in <module>
            from .bo import BO
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/core/bo.py", line 9, in <module>
            from ..util.duplicate_manager import DuplicateManager
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/util/duplicate_manager.py", line 5, in <module>
            from ..core.task.space import Design_space
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/core/task/__init__.py", line 4, in <module>
            from .objective import SingleObjective
          File "/tmp/ahundt/pip-build-fcsazi50/gpyopt/GPyOpt/core/task/objective.py", line 8, in <module>
            import GPy
        ImportError: No module named 'GPy'
        
        ----------------------------------------
    Command "python setup.py egg_info" failed with error code 1 in /tmp/ahundt/pip-build-fcsazi50/gpyopt/
    
    
    opened by ahundt 10
  • Optimization chooses and gets stuck at an infeasible point

    Hi GPyOpt developers. I have an issue where GPyOpt chooses an infeasible next point when the number of variables in my problem exceeds 8, and then immediately converges to a suboptimal infeasible point (with respect to the inequality constraints; the chosen point is still within the domain).

    What I am doing is optimizing multiple l2-norm regularization parameters, an n-dimensional vector denoted x, where the unknown function f(x) is the solution to a regularized logistic regression with respect to some weights, w, under the inequality constraints x_1 <= x_2 <= x_3 <= ... <= x_n. The purpose of these constraints is to reduce the computation time, since any permutation of the elements of x will result in the same value of f(x).

    Here is an example of how I have the bounds and constraints set up for an n = 5 problem before being passed into BayesianOptimization():

    domain = [{'domain': (0.0, 1.0), 'type': 'continuous', 'name': 'x_1_b'},
              {'domain': (0.0, 1.0), 'type': 'continuous', 'name': 'x_2_b'},
              {'domain': (0.0, 1.0), 'type': 'continuous', 'name': 'x_3_b'},
              {'domain': (0.0, 1.0), 'type': 'continuous', 'name': 'x_4_b'},
              {'domain': (0.0, 1.0), 'type': 'continuous', 'name': 'x_5_b'}]

    constraints = [{'constrain': 'x[:,0] - x[:,1]', 'name': 'x_1_2_c'},
                   {'constrain': 'x[:,1] - x[:,2]', 'name': 'x_2_3_c'},
                   {'constrain': 'x[:,2] - x[:,3]', 'name': 'x_3_4_c'},
                   {'constrain': 'x[:,3] - x[:,4]', 'name': 'x_4_5_c'}]
    

    Other information: I am using the matern32 kernel (this phenomenon occurs with matern52 as well) with the maximum likelihood update to the Gaussian process hyperparameters and the lower confidence bound acquisition function (default values except jitter is 1E-4 to protect against singularity of the kernel). The optimization has worked fine with these constraints when n < 9 and also worked for n > 8 when the constraints were removed.

    opened by jkaardal 10
  • plot_acquisition() plots the wrong y-values

    Hi,

    I have a fairly complex function that should always return a number >= 0. Print statements inside this function confirm this. Running opt.fx_opt gives a value close to 0, which should be the true minimum. However, opt.plot_acquisition shows negative values at the minimum. This shouldn't be an artifact of the underlying Gaussian process model, as even the points in red, where GPyOpt evaluates the function, are negative. Why is this the case?

    [attached screenshot: acquisition plot]

    opened by tawe141 8
  • Using physical data not a "known" function

    Hello, I'm new to Python and Gaussian processes, so I need a little help/insight. I am trying to use GPyOpt with physical x, y data rather than a "known" function; the examples all use a known function. I am reading in a .csv file and I want to use Bayesian optimization with these data to tell me what experiment to run next (i.e. obtain the next x, y data point). Any advice and help is appreciated. Thank you.
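
    A minimal sketch of GPyOpt's external-evaluation mode for this situation (the CSV layout and the domain are hypothetical; pass f=None together with the data you already have, then ask for the next experiment to run):

    import numpy as np
    import GPyOpt

    # Existing experimental data: X is (n, d), Y is (n, 1).
    data = np.loadtxt('experiments.csv', delimiter=',')   # hypothetical file with columns [x, y]
    X, Y = data[:, :1], data[:, 1:2]

    domain = [{'name': 'x', 'type': 'continuous', 'domain': (0.0, 10.0)}]  # adjust to your variable range

    # f=None means the objective is evaluated externally (your physical experiment).
    bo = GPyOpt.methods.BayesianOptimization(f=None, domain=domain, X=X, Y=Y)
    x_next = bo.suggest_next_locations()   # run this experiment next, append the result to X, Y, and repeat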

    opened by will-colea 8
  • Fix bad comparison against None using ==

    While running GPyOpt on python3 I started to see the following failure:

    Traceback (most recent call last):
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/testing/test_parallelization.py", line 102, in test_run
        unittest_result = run_eval(problem_config= self.problem_config, f_inits= self.f_inits, method_config=m_c, name=name, outpath=self.outpath, time_limit=None, unittest = self.is_unittest)
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/testing/driver.py", line 47, in run_eval
        verbosity       = m_c['verbosity'])
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/methods/bayesian_optimization.py", line 458, in run_optimization
        super(BayesianOptimization, self).run_optimization(max_iter = max_iter, max_time = max_time,  eps = eps, verbosity=verbosity, save_models_parameters = save_models_parameters, report_file = report_file, evaluations_file= evaluations_file, models_file=models_file)
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/core/bo.py", line 103, in run_optimization
        self._update_model()
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/core/bo.py", line 198, in _update_model
        self._save_model_parameter_values()
      File "/Users/mattlavin/Projects/GPyOpt/GPyOpt/core/bo.py", line 209, in _save_model_parameter_values
        if self.model_parameters_iterations == None:
    ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
    

    It turns out that an array was being compared against None with == instead of is; switching to is avoids the ambiguous truth-value error.
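
    For reference, a small illustration of the difference:

    import numpy as np

    a = np.array([1.0, 2.0])
    print(a == None)   # elementwise comparison: [False False] (a FutureWarning on some numpy versions)
    print(a is None)   # identity check: False -- safe to use inside an `if`
    # `if a == None:` raises "ValueError: The truth value of an array ... is ambiguous"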

    opened by mdlavin 8
  • Matern kernel

    I understand that "kernel" will become deprecated in the new version. How can one specify the GPy Matern 5/2 ARD kernel with the newest API?

    GPyOpt.methods.bayesian_optimization.BayesianOptimization(f, domain=None, constrains=None, cost_withGradients=None, model_type='GP', X=None, Y=None, initial_design_numdata=None, initial_design_type='random', acquisition_type='EI', normalize_Y=True, exact_feval=False, acquisition_optimizer_type='lbfgs', model_update_interval=1, evaluator_type='sequential', batch_size=1, num_cores=1, verbosity=True, verbosity_model=False, bounds=None, **kwargs)
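
    A sketch of one way to do this (objective and domain are placeholders; it assumes BayesianOptimization still forwards a kernel keyword argument to the underlying GP model, which it does in recent versions as far as I know):

    import GPy
    import GPyOpt

    kernel = GPy.kern.Matern52(input_dim=2, variance=1.0, ARD=True)  # one lengthscale per input dimension

    bo = GPyOpt.methods.BayesianOptimization(f=objective, domain=domain,   # placeholders
                                             model_type='GP', kernel=kernel)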

    opened by ghost 8
  • Bayesian updating of hyperparameters

    Dear Developers, We were able to customise our kernel and BO.model and used them with run_optimisation. The BO procedure works well, but we noticed that if we update our model with each acquisition, this also updates the hyperparameters of the GP (by maximum likelihood) and often results in overfitting. Increasing the model update interval was not a good idea either. We cannot find how to a) impose upper/lower numerical bounds on the hyperparameter search and b) refit the model with each acquisition but update the model hyperparameters more slowly. Ideally, we would use Bayesian updating of the hyperparameters (with priors on the hyperparameters); is this available? This is currently the biggest issue for us. Many thanks for your assistance, Milica.
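
    A sketch of both directions (objective and domain are placeholders, and it assumes the kernel keyword is forwarded to the model): GPy lets you bound individual hyperparameters or place priors on them, and GPyOpt's GP_MCMC model with the *_MCMC acquisitions integrates over the hyperparameters instead of refitting them by maximum likelihood.

    import GPy
    import GPyOpt

    kernel = GPy.kern.Matern52(input_dim=2, ARD=True)

    # (a) bound the hyperparameter search used by the maximum-likelihood fits
    kernel.lengthscale.constrain_bounded(0.1, 10.0)
    kernel.variance.constrain_bounded(0.01, 100.0)

    # (b) or place priors on the hyperparameters ...
    kernel.lengthscale.set_prior(GPy.priors.Gamma.from_EV(1.0, 0.5))

    # ... and integrate them out with MCMC rather than maximising the likelihood at every update
    bo = GPyOpt.methods.BayesianOptimization(f=objective, domain=domain,    # placeholders
                                             model_type='GP_MCMC',
                                             acquisition_type='EI_MCMC',
                                             kernel=kernel)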

    question 
    opened by milicasan 8
  • New Acquisition function

    I am trying to write a new acquisition function using the GPyOpt package, following https://nbviewer.org/github/SheffieldML/GPyOpt/blob/master/manual/GPyOpt_creating_new_aquisitions.ipynb. After writing my acquisition function, I am unable to run the Bayesian optimization loop with it. Please give me some suggestions on how to run my new acquisition function.
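
    A rough sketch of the modular route, loosely following that manual (the class and variable names are my own, f and domain are placeholders, and exact module paths can differ between GPyOpt versions):

    import GPyOpt
    from GPyOpt.acquisitions.base import AcquisitionBase

    class MyAcquisition(AcquisitionBase):
        analytical_gradient_prediction = False   # only _compute_acq is needed in this case

        def _compute_acq(self, x):
            m, s = self.model.predict(x)
            return -m + 0.5 * s   # toy exploration-biased score; check the sign convention of the built-ins

    space       = GPyOpt.Design_space(space=domain)                 # domain: the usual list of dicts
    objective   = GPyOpt.core.task.SingleObjective(f)
    model       = GPyOpt.models.GPModel(exact_feval=True, verbose=False)
    acq_optim   = GPyOpt.optimization.AcquisitionOptimizer(space)
    acquisition = MyAcquisition(model, space, optimizer=acq_optim)
    evaluator   = GPyOpt.core.evaluators.Sequential(acquisition)
    X_init      = GPyOpt.experiment_design.initial_design('random', space, 5)

    bo = GPyOpt.methods.ModularBayesianOptimization(model, space, objective,
                                                    acquisition, evaluator, X_init)
    bo.run_optimization(max_iter=10)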

    opened by craju06 0
  • How to check lengthscales when ARD=True.

    I do Bayesian optimisation in two dimensions. When ARD is set to True, printing the kernel shows the lengthscale shape as (2, ). I would like to check the individual lengthscales in the two dimensions; is there a way to do this?
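
    A sketch, assuming the default GP surrogate (bo is a placeholder for your BayesianOptimization object): GPyOpt wraps the GPy model, so the per-dimension lengthscales should be reachable like this.

    gpy_model = bo.model.model            # bo.model is GPyOpt's wrapper, bo.model.model the underlying GPy model
    print(gpy_model.kern.lengthscale)     # two entries when input_dim=2 and ARD=True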

    opened by ktakihara2000 2
  • Optimization terminates earlier than max_iter

    I often see that the optimization terminates earlier than max_iter. I set 'de_duplication' to False and eps to zero (or a negative value), so I do not think that consecutive occurrences of the same x values are the reason.

    I would appreciate it if someone could tell me what is causing this problem and how to solve it.

    opened by takagi-ya 1
  • constrained bayesian optimization

    Hi,

    Is it possible to model constraints via a Gaussian process? (Especially implicit constraints, like a maximum tracking error, which are not constraints on the hyperparameters.) Then output the best parameters based on both cost and constraints?

    Thanks, and have a nice day!

    opened by merdan-9 0
  • Cannot use the batch_size>1 with local penalization evaluator

    Hey all,

    According to the issues I read in this repository, the batch_size>1 only works with the local_penalization evaluator, and not with sequential or any other evaluator model.

    I have set up a Bayesian Optimization model where I use a GP surrogate model m trained on my dataset as my objective function, as follows:

    def obj_func(X):
        out,_ = m.predict(X)
        return(out)
    
    bo_step = GPyOpt.methods.BayesianOptimization(f = obj_func, domain = bounds,
                                                        model_type='GP',normalize_Y = False,
                                                        evaluator_type = 'local_penalization',
                                                        acquisition_type='EI',batch_size=5, maximize=True, eps=1e-8)
    

    When I run the optimization, I only get a single prediction instead of batch_size=5:

    bo_step.run_optimization(max_iter=5)

    If I use the external objective evaluation example as a starting point and call the bo_step.suggest_next_locations() function, I get 5 suggestions, but it does not seem to really maximize my objective function (below). However, I am not sure whether I can/should use this approach, since I already have a surrogate model fitted to my dataset.

    x_next = bo_step.suggest_next_locations()

    Any help or suggestion on this is highly appreciated.

    Best,

    Bulut

    opened by blttkgl 1
  • Can the optimizer stop when best_y is smaller than a specific value?

    Hi, thanks for your contributions.

    I notice that the optimizer finishes when the iteration count exceeds 'max_iter' or the elapsed time exceeds 'max_time'. But I wonder whether I can set a specific value so that, if the best_y found during the optimization is smaller than it, BO stops searching because a good enough result has been found.

    Hoping for your reply, thanks!
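
    As far as I know there is no built-in stopping criterion on the objective value, but a simple workaround is to run the optimization one iteration at a time and stop once the best observed value crosses your threshold (objective, domain, max_iter and target are placeholders):

    bo = GPyOpt.methods.BayesianOptimization(f=objective, domain=domain)
    for _ in range(max_iter):
        bo.run_optimization(max_iter=1)   # one acquisition + evaluation per call
        if bo.fx_opt <= target:           # best observed objective value so far
            break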

    opened by Seal-o-O 0
Releases
  • v1.2.6 (Mar 19, 2020)

    • Small fix in description.
    • coloring acquisition plot by the step of the objective evaluation.
    • Issue #94: Fix constraint violation in anchor generation.
    • Added the ability to set the mean function.
    • Added x and y labels for plotted graphs.
    • Correct typo in RFModel docstring.
    • Pull request: fix branch for Issue-244.
    • Implementing Probability of Feasibility (PoF) constraint handling.
    • Fixed obvious errors in CMA and DIRECT optimizers, added unit tests.
    • Caught mismatch between code and spec: constraints.
    • Fix a broken link in the web page
    • Fix a broken link in the footer.
    • Allow users to choose lhs sampling criteria
    • correct square root in lower confidence bound acquisition
    • Remove unused and add required imports
    • Typo in jupyter command
    • Avoid cost function ignored warning with LCB acquisition
Owner
Sheffield Machine Learning Software
Software from the Sheffield machine learning group and collaborators.