Project author: jbrea

Project description:
Bayesian optimization for Julia
Language: Julia
Repository: git://github.com/jbrea/BayesianOptimization.jl.git
Created: 2018-10-12T07:19:28Z
Project page: https://github.com/jbrea/BayesianOptimization.jl

License: Other

BayesianOptimization

Usage

  using BayesianOptimization, GaussianProcesses, Distributions

  f(x) = sum((x .- 1).^2) + randn()     # noisy function to minimize

  # Choose as a model an elastic GP with 2 input dimensions.
  # The GP is called elastic, because data can be appended efficiently.
  model = ElasticGPE(2,                 # 2 input dimensions
                     mean = MeanConst(0.),
                     kernel = SEArd([0., 0.], 5.),
                     logNoise = 0.,
                     capacity = 3000)   # initial capacity for 3000 samples
  set_priors!(model.mean, [Normal(1, 2)])

  # Optimize the hyperparameters of the GP with maximum a posteriori (MAP)
  # estimation every 50 steps.
  modeloptimizer = MAPGPOptimizer(every = 50,
                                  noisebounds = [-4, 3],  # bounds of the logNoise
                                  # bounds of the 3 kernel parameters GaussianProcesses.get_param_names(model.kernel)
                                  kernbounds = [[-1, -1, 0], [4, 4, 10]],
                                  maxeval = 40)

  opt = BOpt(f,
             model,
             UpperConfidenceBound(),    # type of acquisition function
             modeloptimizer,
             [-5., -5.], [5., 5.],      # lowerbounds, upperbounds
             repetitions = 5,           # evaluate the function at each input 5 times
             maxiterations = 100,       # evaluate at 100 input positions
             sense = Min,               # minimize the function
             acquisitionoptions = (method = :LD_LBFGS, # optimize the acquisition function with NLopt's :LD_LBFGS method
                                   restarts = 5,       # from 5 random initial conditions each time
                                   maxtime = 0.1,      # for at most 0.1 second each time
                                   maxeval = 1000),    # and for at most 1000 iterations (for other options see https://github.com/JuliaOpt/NLopt.jl)
             verbosity = Progress)
  result = boptimize!(opt)
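
For intuition about what boptimize! automates, here is a self-contained toy sketch of the underlying loop: fit a Gaussian process surrogate to the data seen so far, choose the next input by maximizing an acquisition function, evaluate the objective there, and repeat. The fixed β and the random-candidate maximization are deliberate simplifications for illustration (the package scales β adaptively and maximizes the acquisition with NLopt); this is not the package's implementation.

  using GaussianProcesses

  g(x) = sum((x .- 1).^2)            # deterministic toy objective

  function toy_bayesopt(g, n)
      X = 10 .* rand(2, 5) .- 5      # 5 random initial points in [-5, 5]^2
      y = [g(X[:, i]) for i in 1:size(X, 2)]
      for _ in 1:n
          gp = GP(X, -y, MeanConst(0.), SEArd([0., 0.], 0.))  # surrogate of -g, since we minimize
          candidates = 10 .* rand(2, 1000) .- 5   # random search stands in for NLopt here
          μ, σ² = predict_f(gp, candidates)
          ucb = μ .+ 2 .* sqrt.(max.(σ², 0.))     # upper confidence bound with fixed β = 2
          xnext = candidates[:, argmax(ucb)]
          X = hcat(X, xnext)                      # append the new observation
          push!(y, g(xnext))
      end
      minimum(y), X[:, argmin(y)]    # best value found and where it was found
  end

  toy_bayesopt(g, 20)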

Resume optimization

To continue the optimization, one can call boptimize!(opt) multiple times.

  result = boptimize!(opt)   # first call (includes initialization)
  result = boptimize!(opt)   # restart
  maxiterations!(opt, 50)    # set maxiterations for the next call
  result = boptimize!(opt)   # restart again
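
Similarly, the exported helper maxduration! caps the wall-clock budget of the next call. A minimal sketch, assuming the duration is given in seconds (check ?maxduration! to confirm the unit):

  maxduration!(opt, 60.)     # assumed: allow 60 seconds for the next call
  result = boptimize!(opt)   # restart with a time budget instead of an iteration budget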

(Warm-)start with some known function values

By default, the first 5*length(lowerbounds) input points are sampled from a
Sobol sequence. If some function values are already available and the Sobol
initialization should be skipped, one can update the model with the available
data and set initializer_iterations = 0, as in the following example (which
continues the example above, after constructing the modeloptimizer).

  x = [rand(2) for _ in 1:20]
  y = -f.(x)                 # note the sign: BOpt maximizes internally, so with
                             # sense = Min the model is trained on -f values
  append!(model, hcat(x...), y)
  opt = BOpt(f,
             model,
             UpperConfidenceBound(),
             modeloptimizer,
             [-5., -5.], [5., 5.],
             maxiterations = 100,
             sense = Min,
             initializer_iterations = 0)
  result = boptimize!(opt)

This package exports

  • BOpt, boptimize!, optimize
  • acquisition types: ExpectedImprovement, ProbabilityOfImprovement, UpperConfidenceBound, ThompsonSamplingSimple, MutualInformation
  • scaling of standard deviation in UpperConfidenceBound: BrochuBetaScaling, NoBetaScaling
  • GP hyperparameter optimizer: MAPGPOptimizer, NoModelOptimizer
  • initializers: ScaledSobolIterator, ScaledLHSIterator
  • optimization sense: Min, Max
  • verbosity levels: Silent, Timings, Progress
  • helpers: maxduration!, maxiterations!
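
To combine these exports differently from the usage example, only the relevant arguments of BOpt change. Below is a minimal sketch that maximizes a toy function with ExpectedImprovement; it assumes a freshly constructed model and modeloptimizer as in the usage example, and that ExpectedImprovement supports a zero-argument constructor like UpperConfidenceBound.

  g(x) = -sum((x .- 1).^2) + randn()   # toy function to maximize
  opt = BOpt(g,
             model,                    # fresh ElasticGPE, as in the usage example
             ExpectedImprovement(),    # swap in another exported acquisition type
             modeloptimizer,
             [-5., -5.], [5., 5.],
             maxiterations = 50,
             sense = Max,              # maximize instead of minimize
             verbosity = Silent)       # one of the exported verbosity levels
  result = boptimize!(opt)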

Use the REPL help, e.g. ?BOpt, to get more information.

Similar Projects

BayesOpt.jl is a Julia wrapper of the established
BayesOpt toolbox written in C++.

Dragonfly is a feature-rich package
for scalable Bayesian optimization written in Python. Use it in Julia with
PyCall.
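
As an illustration of the PyCall route, here is a minimal sketch. It assumes the dragonfly-opt Python package is installed in the Python environment PyCall is linked against, and that Dragonfly's top-level minimise_function is available as documented in the Dragonfly README; the objective h and the bounds are placeholders.

  using PyCall

  dragonfly = pyimport("dragonfly")   # fails if dragonfly-opt is not installed
  h(x) = sum((x .- 1).^2)             # Julia objective, called from Python
  domain = [[-5., 5.], [-5., 5.]]     # box bounds, one pair per dimension
  # minimise_function(f, domain, max_capital) returns (min_val, min_pt, history).
  min_val, min_pt, history = dragonfly.minimise_function(h, domain, 20)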