
Bayesian Optimization

Type of optimizer algorithm: continuous inputs

About

Bayesian Optimization is a sample-efficient black-box optimization algorithm that maintains an uncertainty-aware approximation \(\tilde{f}(\boldsymbol{x})\) of the objective function \(f\). This surrogate model \(\tilde{f}\) is usually a Gaussian Process, whose predictions and uncertainties are used to build an acquisition function \(\alpha(\boldsymbol{x})\). Optimizing \(\alpha\) yields points that are likely to perform well under \(f\), and by incorporating the surrogate's uncertainty into \(\alpha\), Bayesian Optimization balances exploration and exploitation.
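
As a concrete illustration (one common choice, not necessarily the one used by this solver), the Expected Improvement acquisition function scores a candidate \(\boldsymbol{x}\) by how much the surrogate expects it to improve on the best objective value observed so far, \(f^{*}\):

\[
\alpha_{\mathrm{EI}}(\boldsymbol{x}) = \mathbb{E}\left[\max\left(0,\; \tilde{f}(\boldsymbol{x}) - f^{*}\right)\right],
\]

where the expectation is taken under the Gaussian Process posterior at \(\boldsymbol{x}\). The \(\max\) inside the expectation rewards both high predicted values and high uncertainty, which is one way the exploration-exploitation trade-off shows up in practice.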

Our implementation uses gpytorch and botorch as the engines for Bayesian Optimization [Balandat et al., 2020, Gardner et al., 2018]. We use botorch's default single-task Gaussian Process as the surrogate, and we optimize the acquisition function with a grid search in 1 and 2 dimensions, and with botorch's acquisition-optimization utilities from 3 dimensions onwards.
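
For intuition, the sketch below shows what one iteration of such an inner loop can look like when written directly against botorch. It is a generic illustration under assumed settings (the toy training data, the choice of ExpectedImprovement, and the optimize_acqf arguments are all assumptions made for the example), not the exact code inside VanillaBayesianOptimization.

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll  # older botorch releases use fit_gpytorch_model instead
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data: ten random 2D inputs in [-2, 2]^2 and made-up objective values.
train_x = torch.rand(10, 2, dtype=torch.float64) * 4.0 - 2.0
train_y = -(train_x**2).sum(dim=-1, keepdim=True)

# Fit a single-task GP surrogate by maximizing the exact marginal log-likelihood.
model = SingleTaskGP(train_x, train_y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# Build an acquisition function and optimize it over the box [-2, 2]^2.
acq = ExpectedImprovement(model, best_f=train_y.max())
bounds = torch.tensor([[-2.0, -2.0], [2.0, 2.0]], dtype=torch.float64)
candidate, acq_value = optimize_acqf(
    acq, bounds=bounds, q=1, num_restarts=8, raw_samples=64
)
print(candidate)  # the next point to evaluate with the black box

In 1 and 2 dimensions, our implementation replaces the optimize_acqf step above with a grid search over the domain, as described in the previous paragraph.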

How to run

import numpy as np

from poli.objective_repository import ToyContinuousBlackBox
from poli_baselines.solvers import VanillaBayesianOptimization

# Instantiate the 2D Ackley toy black box from poli.
f_ackley = ToyContinuousBlackBox(function_name="ackley_function_01", n_dimensions=2)

# A single random starting point, clipped to [-2, 2], and its evaluation.
x0 = np.random.randn(2).reshape(1, -1).clip(-2.0, 2.0)
y0 = f_ackley(x0)

# Build the solver from the black box and the initial observations.
bo_solver = VanillaBayesianOptimization(
    black_box=f_ackley,
    x0=x0,
    y0=y0,
)

# Run 10 iterations of Bayesian Optimization on the black box.
bo_solver.solve(max_iter=10)

See more