A package that helps you choose a Python global optimizer package, and a strategy within it, from Ax-Platform, bayesian-optimization, DLib, HyperOpt, NeverGrad, Optuna, Platypus, PyMoo, PySOT, SciPy (classic and SHGO), Skopt, NLopt, Py-BOBYQA, and UltraOpt.
50+ strategies are assigned Elo ratings by the sister repo optimizer-elo-ratings. All are presented in a common calling syntax. By all means contribute more optimizers.
Pass the dimensions of the problem, the function evaluation budget, and the time budget to receive suggestions that are independent of your problem set:

```python
from pprint import pprint
from humpday import suggest

pprint(suggest(n_dim=5, n_trials=130, n_seconds=5*60))
```

Here n_seconds is the total computation budget for the optimizer (not the objective function) across all 130 function evaluations.
Or simply pass your objective function, and it will time it and do something sensible:
```python
import math
import time

from humpday import recommend

def my_objective(u):
    time.sleep(0.01)
    return u[0]*math.sin(u[1])

recommendations = recommend(my_objective, n_dim=21, n_trials=130)
```
If you are feeling lucky, use the meta minimizer, which will choose an optimizer based only on the dimension and number of function evaluations, then run it:

```python
from humpday import minimize

best_val, best_x = minimize(objective, n_dim=13, n_trials=130)
```
Here and elsewhere, `objective` is intended to be minimized on the hypercube [0,1]^n_dim.
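If your problem is defined on a different box, you can wrap it so the optimizer still sees the unit hypercube. A minimal sketch (the helper `on_hypercube` and the example bounds are illustrative, not part of humpday):

```python
import math

def on_hypercube(objective, bounds):
    """Wrap an objective defined on arbitrary box bounds so it can be
    minimized on the unit hypercube [0,1]^n_dim instead.

    bounds: list of (low, high) pairs, one per dimension.
    (Illustrative helper, not part of humpday itself.)
    """
    def wrapped(u):
        # Linearly rescale each coordinate from [0,1] to [low, high]
        x = [lo + ui * (hi - lo) for ui, (lo, hi) in zip(u, bounds)]
        return objective(x)
    return wrapped

def my_objective(x):
    # Defined on the box [-5,5] x [0,10]
    return x[0] * math.sin(x[1])

unit_objective = on_hypercube(my_objective, bounds=[(-5, 5), (0, 10)])
```

The wrapped `unit_objective` can then be passed to `minimize` or `recommend` as above.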
Better yet, call points_race on a list of your own objective functions:
```python
from humpday import points_race

points_race(objectives=[my_objective]*2, n_dim=5, n_trials=100)
```
Here is a notebook you can open in Colab and run, illustrating the points race.
Install:

```shell
pip install humpday
```

Bleeding edge:

```shell
pip install git+https://github.com/microprediction/humpday
```
File an issue if you have problems. Some optimizer packages are not installed by default; install them directly if you want them to be included (dlib is strongly recommended, but cmake is broken on some operating systems):

```shell
pip install ultraopt
pip install hyperopt
pip install dlib
```

If you get a CMake error, try:

```shell
pip install cmake
```