MlBayesOpt's People

Contributors

ck37, jdreaver, ymattu

MlBayesOpt's Issues

recipes friendly

When using the recipes package, it would be nice to be able to pass a recipe object to MlBayesOpt functions, like:

rec <- recipes::recipe(data, y ~ .) %>%
  step_****() %>%
  step_****()
res <- xgb_cv_opt(recipe = rec)
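Until an interface like that exists, one possible bridge is to prep() and bake() the recipe first and feed the resulting data frame to the existing functions. A minimal sketch, assuming the recipes package; the particular steps below are illustrative, not part of the request:

library(recipes)

rec <- recipe(Species ~ ., data = iris) %>%
  step_center(all_numeric()) %>%   # illustrative preprocessing steps
  step_scale(all_numeric())

# bake() returns an ordinary data frame, which the current
# MlBayesOpt API can already consume via its data argument.
baked <- bake(prep(rec, training = iris), new_data = iris)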

Random Forest optimization

Hello,

Interesting package - I'd like to give it a try sometime soon. Regarding the random forest implementation, I'd like to suggest a few changes:

  1. Minobspernode - it would be great to support optimizing over the minimum number of observations per node hyperparameter, because that can be used to reduce overfitting in random forests.

  2. Ntree - I don't think there is a point in optimizing over ntree. Breiman proved in his 2001 RF article (around section 2.1) that there is no problem with increasing ntree arbitrarily: performance simply converges to a plateau, so tuning it only wastes computation. I don't see any benefit to optimizing ntree, since a larger number of trees does no harm (unlike in GBM) - see the sketch below.
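For what it's worth, the plateau is easy to check empirically with ranger (the backend MlBayesOpt uses for random forests); a small sketch with illustrative tree counts:

library(ranger)
set.seed(1)
# Out-of-bag prediction error for increasing forest sizes; it should
# flatten out rather than degrade, illustrating why tuning ntree buys little.
sapply(c(50, 100, 250, 500, 1000), function(nt) {
  ranger(Species ~ ., data = iris, num.trees = nt)$prediction.error
})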

What do you think?

Appreciate it,
Chris

Error in table(testlabel, t.pred$predictions) : all arguments must have the same length

Hi,
I tried to run rf_opt where my training set has dim = (101673, 10) and my test set has dim = (43574, 10).
I get this error:
Error in table(testlabel, t.pred$predictions) :
all arguments must have the same length
Timing stopped at: 4.31 0.22 1.81

I've checked that there are no NA values in either dataset.
All my variables are factors.

When I run it with iris_train and iris_test it runs smoothly. When I use iris_train as-is and sample iris_test down to 30 rows, it still works.
When I sample both my train and test sets down to 1,000 rows, rf_opt also runs smoothly.

So I'm somewhat confused. If the cause is a differing number of rows, why does it still work on the iris data, where the row counts also differ?

Thanks.
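A diagnostic sketch rather than a fix: the error means the two vectors handed to table() have different lengths, so it is worth checking directly whether ranger returns one prediction per test row. This roughly mirrors what rf_opt does internally; swap in your own train/test sets where iris_train/iris_test appear:

library(MlBayesOpt)   # ships the iris_train / iris_test example data
library(ranger)

fit   <- ranger(Species ~ ., data = iris_train)
preds <- predict(fit, data = iris_test)$predictions
length(preds) == nrow(iris_test)   # FALSE on your data would explain the error
# Also worth ruling out: rows dropped at predict time, e.g. factor levels
# in the test set that never appear in training.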

caret friendly

Currently → depends on the e1071, ranger, and xgboost packages
Proposed → depends on the caret package only
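A rough sketch of the idea: caret already wraps ranger (method "ranger") and xgboost (method "xgbTree"), so a caret-based backend would give one uniform interface, even though caret still calls those engines underneath. Illustrative only, not the package's plan:

library(caret)
# One interface for many model types; only the method string changes.
fit <- train(Species ~ ., data = iris,
             method = "ranger",
             trControl = trainControl(method = "cv", number = 3))
fit$results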

Error in GP_deviance(param_init_200d[i, ], X, Y, nug_thres, corr = corr) : Infinite values of the Deviance Function, unable to find optimum parameters

When using the xgb_cv_opt function I get the following error after four rounds.

Error in GP_deviance(param_init_200d[i, ], X, Y, nug_thres, corr = corr) : Infinite values of the Deviance Function, unable to find optimum parameters

I found this page

yanyachen/rBayesianOptimization#36

discussing the same error in the rBayesianOptimization package, and I tried playing around with the parameter ranges, but I still cannot get the optimization to run. Reproducible example below.

df_example <- read.csv(textConnection("V2,V3,V4,V5,V6,V7,V8,V9,V10,V11,V12,V13,V14,V15,V16,V17,V18,V19,V20,V21,V22,V23,V24,V25,V26,V27,V28,V29,V30,V31,V32,V33,V34,V35,V36,V37,V38,V39,V40,V41,V42,V43,V44,V45,V46,V47,V48,V49,V50,V51,V52,V53,V54,V55,V56,V57,V58,V59,V60,V61,V62,V63,V64,V65,V66,V67,V68,V69,V70,V71,V72,V73,y
                        1.027244757,-0.362509685,-0.14182585,0,1,0,1,1,0,0,0.098083245,0.155305631,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,1
                        0.140333592,-0.362509685,-0.14182585,0,1,0,1,1,0,0,0.294338347,0.066774355,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.511186324,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,0.437868197,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1
                        1.027244757,-0.362509685,-0.14182585,0,1,0,0,0,0,1,1.205313519,0.39842224,1,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,0.644375431,-0.068217098,0,0,1,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.288479985,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.171401372,-0.068217098,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,1.427931246,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,1
                        -2.165635439,-0.362509685,0.2818476,1,0,1,0,0,0,0,0.013137007,-0.068217098,0,1,0,1,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0
                        -0.923959807,1.77708902,-0.05709116,0,0,0,0,0,0,0,-0.374979425,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.670826667,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0
                        1.027244757,3.916687725,7.56903095,0,0,0,0,0,0,1,-0.875869311,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,1,0,-0.711835196,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.719069536,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.960815549,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1
                        -0.923959807,-0.362509685,-0.05709116,0,0,0,0,0,0,1,0.188887844,-0.068217098,0,0,1,0,1,0,0,0,0,1,0,0,0,1,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0
                        1.027244757,1.77708902,0.536051671,0,0,0,0,0,0,0,-0.650322403,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.44674435,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,1
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.979766611,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,3.725873442,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.952028007,-0.068217098,1,0,0,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0
                        -0.923959807,1.77708902,-0.22656054,0,0,0,0,0,0,1,1.323945334,-0.068217098,0,0,1,0,0,0,1,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.515580095,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1
                        1.027244757,1.77708902,0.02764353,0,1,0,0,0,0,1,-0.53901354,-0.030471007,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.273922694,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.744056183,-0.068217098,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0
                        0.140333592,-0.362509685,-0.05709116,0,0,0,0,0,0,1,-0.735268641,-0.068217098,0,0,1,0,0,0,1,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,2.809039908,-0.068217098,0,0,1,1,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1
                        -2.165635439,-0.362509685,-0.22656054,0,0,1,0,0,0,0,-0.976926042,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0
                        0.140333592,1.77708902,-0.22656054,0,0,0,0,0,0,1,-0.751379134,-0.068217098,0,0,1,0,0,0,1,1,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,1,0,1.262432541,-0.068217098,0,1,0,1,0,0,0,1,0,0,0,1,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        0.140333592,1.77708902,-0.05709116,0,0,0,0,0,0,1,-0.328112535,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,1,0,0,0,1
                        -2.165635439,-0.362509685,-0.22656054,0,0,1,0,0,0,0,0.427616065,-0.068217098,1,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.380837786,-0.068217098,0,1,0,0,0,0,1,0,0,0,1,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.300285319,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.650322403,-0.068217098,1,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.335435487,-0.068217098,1,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        0.140333592,1.77708902,0.11237822,0,0,0,0,1,0,1,-0.654716174,0.954291261,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.809962747,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,1,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,1,1,-0.177259733,-0.068217098,0,1,0,0,1,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,1
                        1.027244757,-0.362509685,0.705521051,0,1,0,1,1,1,1,0.54917706,1.999725356,1,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,1.318086973,-0.068217098,0,1,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.679614209,-0.068217098,1,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0
                        1.027244757,-0.362509685,-0.14182585,0,1,0,1,1,0,1,0.467160003,0.206564964,0,0,1,0,0,0,1,1,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.14182585,0,0,0,0,0,0,1,2.415065115,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0
                        -2.165635439,-0.362509685,0.11237822,0,0,0,1,1,0,0,-0.625424368,0.012099869,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.448120329,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.73087487,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.814267906,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.411594183,-0.068217098,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.246095478,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.291497777,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.714764377,-0.068217098,0,1,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,1
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.58880961,-0.068217098,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,0.11237822,0,1,0,0,0,1,0,-0.672291258,-0.035109976,0,1,0,0,1,0,0,0,0,1,0,0,0,1,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.908090298,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.301749909,-0.068217098,0,0,1,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.376444015,-0.068217098,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0
                        1.027244757,1.77708902,-0.05709116,0,1,0,0,0,0,1,-0.44527976,-0.043375544,0,1,0,1,0,0,0,0,0,0,1,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1
                        1.027244757,-0.362509685,2.230745473,0,1,0,0,1,1,1,0.981231202,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,1,0,0,1,0,-0.708906016,-0.068217098,0,1,0,0,1,0,0,0,0,1,0,1,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,1,0,0,-0.125999072,-0.065276094,0,0,1,1,0,0,0,0,0,0,1,0,1,0,1,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,1.647619793,-0.068217098,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.44674435,-0.068217098,0,1,0,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.194746205,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.985713584,-0.068217098,0,0,1,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,1
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.326647945,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.973996862,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,1
                        -0.923959807,1.77708902,-0.22656054,0,0,1,0,0,0,1,-0.477500747,-0.068217098,0,0,1,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.673755848,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,1,0,0,0,0,-0.912484069,-0.068217098,1,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0
                        0.140333592,1.77708902,-0.22656054,0,1,0,0,0,1,0,-0.407200412,-0.050590295,0,1,0,0,0,0,1,1,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.669362077,-0.068217098,0,1,0,0,0,0,1,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0
                        0.140333592,-0.362509685,-0.14182585,0,0,0,0,0,0,0,1.366418453,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.878798491,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0
                        1.027244757,1.77708902,1.976541403,0,0,0,0,0,0,1,-0.92127161,-0.068217098,0,0,1,1,0,0,0,0,0,1,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0
                        1.027244757,1.77708902,0.536051671,0,0,0,0,0,0,0,0.21525047,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,0,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.464319434,-0.068217098,0,1,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0
                        1.027244757,-0.362509685,-0.14182585,0,1,0,1,0,0,0,-0.679614209,-0.015445615,0,0,1,0,1,0,0,0,0,0,1,0,0,1,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,0.506703941,-0.068217098,0,0,1,0,0,0,1,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,1
                        -2.165635439,-0.362509685,-0.22656054,0,1,1,0,1,0,0,-0.54194272,-0.041478948,1,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0
                        -2.165635439,1.77708902,-0.22656054,0,0,1,0,0,0,0,-0.046911196,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,1,-0.320789583,-0.068217098,0,0,1,0,0,0,1,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0
                        1.027244757,1.77708902,-0.22656054,0,0,0,0,0,0,0,0.194746205,-0.068217098,0,0,1,0,1,0,0,0,0,1,0,1,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,1
                        1.027244757,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.199139976,-0.068217098,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,1,0,0.883103651,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0
                        -2.165635439,-0.362509685,-0.22656054,0,0,1,0,0,0,0,-0.417452544,-0.068217098,1,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0
                        0.140333592,1.77708902,0.02764353,0,0,0,0,0,0,1,-0.300285319,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,0,1,1,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0
                        -2.165635439,1.77708902,0.19711291,0,0,1,0,0,0,0,-0.72794569,-0.068217098,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0
                        0.140333592,1.77708902,-0.22656054,0,0,0,0,0,0,1,-0.997430306,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,1
                        -2.165635439,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.890515214,-0.068217098,0,0,1,1,0,0,0,0,0,1,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0
                        -0.923959807,3.916687725,5.365929007,0,0,0,0,0,0,1,-0.755772905,-0.068217098,0,0,1,1,0,0,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,1
                        1.027244757,-0.362509685,8.585847231,0,0,0,0,0,0,0,-0.719158148,-0.068217098,0,0,1,1,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0
                        0.140333592,6.056286429,-0.22656054,0,0,0,0,0,0,1,1.407426982,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,1
                        -0.923959807,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.924200791,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,0
                        1.027244757,-0.362509685,-0.22656054,0,0,0,1,0,0,0,-0.995965716,-0.068217098,0,0,1,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1
                        0.140333592,-0.362509685,-0.14182585,0,0,0,0,0,0,0,-0.183118094,-0.068217098,0,1,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,0,-0.042517425,-0.068217098,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0
                        0.140333592,-0.362509685,-0.22656054,0,0,0,0,0,0,1,0.077578981,-0.068217098,0,1,0,1,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,0,0,1,0,0,0,0,0,1,0,0,1"
))
set.seed(71)    

res0 <- xgb_cv_opt(data = df_example,
                   label = y,
                   objectfun = "binary:logistic",
                   evalmetric = "auc",
                   n_folds = 3,
                   classes = 10,
                   init_points = 4,
                   n_iter = 5)
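One workaround suggested in threads like the one linked above is to give the GP more varied initial observations - more init_points and wider or shifted ranges - since the deviance tends to blow up when the initial scores are (near-)identical. A hedged re-run sketch; these particular ranges are illustrative, not a verified fix (it may also be worth dropping classes = 10, which looks intended for multiclass objectives):

set.seed(71)
res0 <- xgb_cv_opt(data = df_example,
                   label = y,
                   objectfun = "binary:logistic",
                   evalmetric = "auc",
                   n_folds = 3,
                   eta_range = c(0.05, 0.5),     # illustrative
                   max_depth_range = c(3L, 8L),  # illustrative
                   init_points = 10,             # more initial observations
                   n_iter = 5)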

Error installing package

Hello, I'm trying to install your package and I get an error.
This is the output:

devtools::install_github("ymattu/MlBayesOpt")
Downloading GitHub repo ymattu/MlBayesOpt@master
from URL https://api.github.com/repos/ymattu/MlBayesOpt/zipball/master
Installing MlBayesOpt
Installing 1 package: xgboost
Installing package into ‘\\SRVR-FT/Redireccionamiento de carpetas/sdonikian/Documents/R/win-library/3.3’
(as ‘lib’ is unspecified)
trying URL 'https://cran.rstudio.com/bin/windows/contrib/3.3/xgboost_0.6-4.zip'
Content type 'application/zip' length 1693578 bytes (1.6 MB)
downloaded 1.6 MB

package ‘xgboost’ successfully unpacked and MD5 sums checked

The downloaded binary packages are in
	C:\Users\sdonikian\AppData\Local\Temp\RtmpqwvNhH\downloaded_packages
"C:/PROGRA~1/R/R-33~1.1/bin/x64/R" --no-site-file --no-environ --no-save --no-restore --quiet CMD  \
  INSTALL  \
  "C:/Users/sdonikian/AppData/Local/Temp/RtmpqwvNhH/devtools35030712993/ymattu-MlBayesOpt-e9054a9"  \
  --library="\\SRVR-FT/Redireccionamiento de carpetas/sdonikian/Documents/R/win-library/3.3"  \
  --install-tests 

* installing *source* package 'MlBayesOpt' ...
** R
** data
*** moving datasets to lazyload DB
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
*** arch - i386
Warning in library(pkg_name, lib.loc = lib, character.only = TRUE, logical.return = TRUE) :
  there is no package called 'MlBayesOpt'
Error: loading failed
Execution interrupted
*** arch - x64
Warning in library(pkg_name, lib.loc = lib, character.only = TRUE, logical.return = TRUE) :
  there is no package called 'MlBayesOpt'
Error: loading failed
Execution interrupted
ERROR: loading failed for 'i386', 'x64'
* removing '\\SRVR-FT/Redireccionamiento de carpetas/sdonikian/Documents/R/win-library/3.3/MlBayesOpt'
Error: Command failed (1)

Can you help me with this?

Regards
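Not a confirmed diagnosis, but the library path in the log is a UNC network share (\\SRVR-FT\...), and R's test-load step during installation frequently fails to load packages from UNC paths on Windows. A hedged workaround sketch is to install into a local library first; the local path below is illustrative:

dir.create("C:/Rlibs", showWarnings = FALSE)
.libPaths(c("C:/Rlibs", .libPaths()))   # put the local library first
devtools::install_github("ymattu/MlBayesOpt")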

Error with max_depth in xgb_cv_opt

Hello, I'm getting an error with xgb_cv_opt:

Error in xgb.iter.update(fd$bst, fd$dtrain, iteration - 1, obj) : Invalid Parameter format for max_depth expect int but value='8.26801588479429' Timing stopped at: 0.9 0.08 0.982

I'm currently experimenting with your package, and this is what the call currently looks like:

opt_cv <- xgb_cv_opt(data = train_matrix,
                     label = y_train,
                     objectfun = "binary:logistic",
                     evalmetric = "logloss",
                     #eta_range = c(0.1, 1),
                     max_depth_range = c(5, 9),
                     #nrounds_range = c(100, 400),
                     n_folds = 5,
                     init_points = 6,
                     n_iter = 1)

Obviously, max_depth can't be a non-integer, so what's going on?
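A hedged workaround: rBayesianOptimization only treats a bound as integer-valued (and rounds the proposed parameter, as the traceback in another issue below shows via its Type == "integer" branch) when the bound is supplied as an integer vector. Passing c(5L, 9L) instead of c(5, 9) may therefore keep max_depth integral; untested beyond that assumption:

opt_cv <- xgb_cv_opt(data = train_matrix,   # as in the call above
                     label = y_train,
                     objectfun = "binary:logistic",
                     evalmetric = "logloss",
                     max_depth_range = c(5L, 9L),   # note the L suffix
                     n_folds = 5,
                     init_points = 6,
                     n_iter = 1)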

XGBoost 0.81.0.1 breaks MlBayesOpt's tests

Hello @ymattu ,

This is Tong, the maintainer of the XGBoost R package. We are planning to submit version 0.81.0.1 to CRAN.

However, in the process CRAN alerted us that our update breaks your tests. The error message is attached. We would appreciate it if you could check and update. Thanks!

Package: MlBayesOpt
Check: tests
New result: ERROR
Running ‘testthat.R’ [48s/48s]
Running the tests in ‘tests/testthat.R’ failed.
Complete output:
> library(testthat)
> library(MlBayesOpt)
>
> test_check("MlBayesOpt")
elapsed = 0.02 Round = 1 mtry_opt = 3.6634 min_node_size = 7.0000 Value = 0.1800
elapsed = 0.02 Round = 2 mtry_opt = 5.4408 min_node_size = 4.0000 Value = 0.1400
elapsed = 0.01 Round = 3 mtry_opt = 3.6190 min_node_size = 7.0000 Value = 0.1800
elapsed = 0.01 Round = 4 mtry_opt = 2.6933 min_node_size = 3.0000 Value = 0.2000
elapsed = 0.01 Round = 5 mtry_opt = 3.5290 min_node_size = 3.0000 Value = 0.1600
elapsed = 0.01 Round = 6 mtry_opt = 8.5781 min_node_size = 5.0000 Value = 0.1500
elapsed = 0.01 Round = 7 mtry_opt = 6.2937 min_node_size = 5.0000 Value = 0.1600
elapsed = 0.01 Round = 8 mtry_opt = 8.1154 min_node_size = 4.0000 Value = 0.1400
elapsed = 0.01 Round = 9 mtry_opt = 3.7041 min_node_size = 4.0000 Value = 0.1700
elapsed = 0.01 Round = 10 mtry_opt = 4.4780 min_node_size = 9.0000 Value = 0.1800
elapsed = 0.01 Round = 11 mtry_opt = 1.9407 min_node_size = 1.0000 Value = 0.1600
elapsed = 0.01 Round = 12 mtry_opt = 7.0937 min_node_size = 6.0000 Value = 0.1300
elapsed = 0.01 Round = 13 mtry_opt = 2.1344 min_node_size = 8.0000 Value = 0.1500
elapsed = 0.01 Round = 14 mtry_opt = 7.1353 min_node_size = 2.0000 Value = 0.1400
elapsed = 0.01 Round = 15 mtry_opt = 7.7371 min_node_size = 8.0000 Value = 0.1400
elapsed = 0.01 Round = 16 mtry_opt = 7.2140 min_node_size = 9.0000 Value = 0.1700
elapsed = 0.01 Round = 17 mtry_opt = 2.0706 min_node_size = 5.0000 Value = 0.1700
elapsed = 0.01 Round = 18 mtry_opt = 7.4475 min_node_size = 3.0000 Value = 0.1400
elapsed = 0.01 Round = 19 mtry_opt = 8.1743 min_node_size = 5.0000 Value = 0.1700
elapsed = 0.01 Round = 20 mtry_opt = 8.4158 min_node_size = 1.0000 Value = 0.1500
elapsed = 0.01 Round = 21 mtry_opt = 2.5509 min_node_size = 3.0000 Value = 0.1700

 Best Parameters Found: 
Round = 4   mtry_opt = 2.6933       min_node_size = 3.0000  Value = 0.2000 
List of 4
 $ Best_Par  : Named num [1:2] 2.69 3
  ..- attr(*, "names")= chr [1:2] "mtry_opt" "min_node_size"
 $ Best_Value: num 0.2
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round        : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ mtry_opt     : num [1:21] 3.66 5.44 3.62 2.69 3.53 ...
  ..$ min_node_size: num [1:21] 7 4 7 3 3 5 5 4 4 9 ...
  ..$ Value        : num [1:21] 0.18 0.14 0.18 0.2 0.16 0.15 0.16 0.14 0.17 0.18 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        1 obs. of  21 variables:
  ..$ V1 : num 0.18
  ..$ V2 : num 0.14
  ..$ V3 : num 0.18
  ..$ V4 : num 0.2
  ..$ V5 : num 0.16
  ..$ V6 : num 0.15
  ..$ V7 : num 0.16
  ..$ V8 : num 0.14
  ..$ V9 : num 0.17
  ..$ V10: num 0.18
  ..$ V11: num 0.16
  ..$ V12: num 0.13
  ..$ V13: num 0.15
  ..$ V14: num 0.14
  ..$ V15: num 0.14
  ..$ V16: num 0.17
  ..$ V17: num 0.17
  ..$ V18: num 0.14
  ..$ V19: num 0.17
  ..$ V20: num 0.15
  ..$ V21: num 0.17
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.01      Round = 1       gamma_opt = 3.3299      cost_opt = 61.5259      Value = 0.1900 
elapsed = 0.01      Round = 2       gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2100 
elapsed = 0.01      Round = 3       gamma_opt = 3.2744      cost_opt = 70.8278      Value = 0.1700 
elapsed = 0.01      Round = 4       gamma_opt = 2.1175      cost_opt = 21.9740      Value = 0.1600 
elapsed = 0.01      Round = 5       gamma_opt = 3.1619      cost_opt = 19.3146      Value = 0.1600 
elapsed = 0.01      Round = 6       gamma_opt = 9.4727      cost_opt = 46.3378      Value = 0.1600 
elapsed = 0.01      Round = 7       gamma_opt = 6.6175      cost_opt = 41.6790      Value = 0.1400 
elapsed = 0.01      Round = 8       gamma_opt = 8.8943      cost_opt = 33.0888      Value = 0.1300 
elapsed = 0.01      Round = 9       gamma_opt = 3.3808      cost_opt = 29.9110      Value = 0.0800 
elapsed = 0.01      Round = 10      gamma_opt = 4.3481      cost_opt = 88.7062      Value = 0.1500 
elapsed = 0.01      Round = 11      gamma_opt = 1.1767      cost_opt = 5.2563       Value = 0.1300 
elapsed = 0.01      Round = 12      gamma_opt = 7.6174      cost_opt = 60.4227      Value = 0.1500 
elapsed = 0.01      Round = 13      gamma_opt = 1.4188      cost_opt = 79.6450      Value = 0.1700 
elapsed = 0.01      Round = 14      gamma_opt = 7.6693      cost_opt = 6.2103       Value = 0.0900 
elapsed = 0.01      Round = 15      gamma_opt = 8.4215      cost_opt = 78.2717      Value = 0.1300 
elapsed = 0.01      Round = 16      gamma_opt = 7.7677      cost_opt = 83.7658      Value = 0.1800 
elapsed = 0.01      Round = 17      gamma_opt = 1.3391      cost_opt = 45.6691      Value = 0.1100 
elapsed = 0.01      Round = 18      gamma_opt = 8.0596      cost_opt = 22.1903      Value = 0.1500 
elapsed = 0.01      Round = 19      gamma_opt = 8.9679      cost_opt = 46.9767      Value = 0.2000 
elapsed = 0.01      Round = 20      gamma_opt = 9.2699      cost_opt = 3.9481       Value = 0.1100 
elapsed = 0.01      Round = 21      gamma_opt = 9.0152      cost_opt = 39.2284      Value = 0.2000 

 Best Parameters Found: 
Round = 2   gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2100 
List of 4
 $ Best_Par  : Named num [1:2] 5.55 28.76
  ..- attr(*, "names")= chr [1:2] "gamma_opt" "cost_opt"
 $ Best_Value: num 0.21
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round    : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ gamma_opt: num [1:21] 3.33 5.55 3.27 2.12 3.16 ...
  ..$ cost_opt : num [1:21] 61.5 28.8 70.8 22 19.3 ...
  ..$ Value    : num [1:21] 0.19 0.21 0.17 0.16 0.16 0.16 0.14 0.13 0.08 0.15 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        100 obs. of  21 variables:
  ..$ V1 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V2 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V3 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V4 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V5 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V6 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V7 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V8 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V9 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V10: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V11: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V12: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V13: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V14: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V15: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V16: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V17: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V18: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V19: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V20: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V21: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.01      Round = 1       gamma_opt = 3.3299      cost_opt = 61.5259      Value = 0.1900 
elapsed = 0.01      Round = 2       gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2300 
elapsed = 0.01      Round = 3       gamma_opt = 3.2744      cost_opt = 70.8278      Value = 0.1900 
elapsed = 0.01      Round = 4       gamma_opt = 2.1175      cost_opt = 21.9740      Value = 0.1900 
elapsed = 0.01      Round = 5       gamma_opt = 3.1619      cost_opt = 19.3146      Value = 0.1900 
elapsed = 0.01      Round = 6       gamma_opt = 9.4727      cost_opt = 46.3378      Value = 0.2200 
elapsed = 0.01      Round = 7       gamma_opt = 6.6175      cost_opt = 41.6790      Value = 0.2200 
elapsed = 0.01      Round = 8       gamma_opt = 8.8943      cost_opt = 33.0888      Value = 0.2200 
elapsed = 0.01      Round = 9       gamma_opt = 3.3808      cost_opt = 29.9110      Value = 0.1900 
elapsed = 0.01      Round = 10      gamma_opt = 4.3481      cost_opt = 88.7062      Value = 0.2300 
elapsed = 0.01      Round = 11      gamma_opt = 1.1767      cost_opt = 5.2563       Value = 0.2000 
elapsed = 0.01      Round = 12      gamma_opt = 7.6174      cost_opt = 60.4227      Value = 0.2200 
elapsed = 0.01      Round = 13      gamma_opt = 1.4188      cost_opt = 79.6450      Value = 0.1800 
elapsed = 0.01      Round = 14      gamma_opt = 7.6693      cost_opt = 6.2103       Value = 0.2200 
elapsed = 0.01      Round = 15      gamma_opt = 8.4215      cost_opt = 78.2717      Value = 0.2300 
elapsed = 0.01      Round = 16      gamma_opt = 7.7677      cost_opt = 83.7658      Value = 0.2200 
elapsed = 0.01      Round = 17      gamma_opt = 1.3391      cost_opt = 45.6691      Value = 0.1800 
elapsed = 0.01      Round = 18      gamma_opt = 8.0596      cost_opt = 22.1903      Value = 0.2200 
elapsed = 0.01      Round = 19      gamma_opt = 8.9679      cost_opt = 46.9767      Value = 0.2200 
elapsed = 0.01      Round = 20      gamma_opt = 9.2699      cost_opt = 3.9481       Value = 0.1800 
elapsed = 0.01      Round = 21      gamma_opt = 9.6352      cost_opt = 14.7148      Value = 0.2200 

 Best Parameters Found: 
Round = 2   gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2300 
List of 4
 $ Best_Par  : Named num [1:2] 5.55 28.76
  ..- attr(*, "names")= chr [1:2] "gamma_opt" "cost_opt"
 $ Best_Value: num 0.23
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round    : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ gamma_opt: num [1:21] 3.33 5.55 3.27 2.12 3.16 ...
  ..$ cost_opt : num [1:21] 61.5 28.8 70.8 22 19.3 ...
  ..$ Value    : num [1:21] 0.19 0.23 0.19 0.19 0.19 0.22 0.22 0.22 0.19 0.23 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        1 obs. of  21 variables:
  ..$ V1 : num 0.19
  ..$ V2 : num 0.23
  ..$ V3 : num 0.19
  ..$ V4 : num 0.19
  ..$ V5 : num 0.19
  ..$ V6 : num 0.22
  ..$ V7 : num 0.22
  ..$ V8 : num 0.22
  ..$ V9 : num 0.19
  ..$ V10: num 0.23
  ..$ V11: num 0.2
  ..$ V12: num 0.22
  ..$ V13: num 0.18
  ..$ V14: num 0.22
  ..$ V15: num 0.23
  ..$ V16: num 0.22
  ..$ V17: num 0.18
  ..$ V18: num 0.22
  ..$ V19: num 0.22
  ..$ V20: num 0.18
  ..$ V21: num 0.22
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.02      Round = 1       eta_opt = 0.2854        max_depth_opt = 5.0000  nrounds_opt = 112.9858  subsample_opt = 0.4052  bytree_opt = 0.5438     Value = -0.3026 
elapsed = 0.01      Round = 2       eta_opt = 0.2589        max_depth_opt = 5.0000  nrounds_opt = 147.5089  subsample_opt = 0.8555  bytree_opt = 0.4354     Value = -0.1330 
elapsed = 0.01      Round = 3       eta_opt = 0.7183        max_depth_opt = 5.0000  nrounds_opt = 109.4287  subsample_opt = 0.4120  bytree_opt = 0.7854     Value = -0.0753 
elapsed = 0.01      Round = 4       eta_opt = 0.4457        max_depth_opt = 4.0000  nrounds_opt = 92.0318   subsample_opt = 0.4004  bytree_opt = 0.9258     Value = -0.0841 
elapsed = 0.01      Round = 5       eta_opt = 0.7929        max_depth_opt = 6.0000  nrounds_opt = 76.3611   subsample_opt = 0.5287  bytree_opt = 0.8673     Value = -0.0526 
elapsed = 0.01      Round = 6       eta_opt = 0.5479        max_depth_opt = 5.0000  nrounds_opt = 78.9520   subsample_opt = 0.9030  bytree_opt = 0.8784     Value = -0.0263 
elapsed = 0.01      Round = 7       eta_opt = 0.7459        max_depth_opt = 6.0000  nrounds_opt = 98.4645   subsample_opt = 0.8779  bytree_opt = 0.6732     Value = -0.0263 
elapsed = 0.01      Round = 8       eta_opt = 0.9927        max_depth_opt = 4.0000  nrounds_opt = 116.6771  subsample_opt = 0.4510  bytree_opt = 0.6461     Value = -0.0351 
elapsed = 0.01      Round = 9       eta_opt = 0.4420        max_depth_opt = 5.0000  nrounds_opt = 129.5805  subsample_opt = 0.7996  bytree_opt = 0.8865     Value = -0.0175 
elapsed = 0.01      Round = 10      eta_opt = 0.7997        max_depth_opt = 5.0000  nrounds_opt = 106.6147  subsample_opt = 0.9646  bytree_opt = 0.7630     Value = -0.0263 
elapsed = 0.01      Round = 11      eta_opt = 0.9412        max_depth_opt = 6.0000  nrounds_opt = 152.1588  subsample_opt = 0.4912  bytree_opt = 0.7928     Value = -0.0577 
elapsed = 0.01      Round = 12      eta_opt = 0.2909        max_depth_opt = 5.0000  nrounds_opt = 96.4243   subsample_opt = 0.7413  bytree_opt = 0.6119     Value = -0.0943 
elapsed = 0.01      Round = 13      eta_opt = 0.6865        max_depth_opt = 6.0000  nrounds_opt = 111.3159  subsample_opt = 0.4600  bytree_opt = 0.5622     Value = -0.1579 
elapsed = 0.01      Round = 14      eta_opt = 0.2130        max_depth_opt = 5.0000  nrounds_opt = 99.9155   subsample_opt = 0.3928  bytree_opt = 0.9956     Value = -0.1491 
elapsed = 0.01      Round = 15      eta_opt = 0.3405        max_depth_opt = 5.0000  nrounds_opt = 128.5783  subsample_opt = 0.7814  bytree_opt = 0.7801     Value = -0.0351 
elapsed = 0.01      Round = 16      eta_opt = 0.4475        max_depth_opt = 6.0000  nrounds_opt = 93.2215   subsample_opt = 0.2824  bytree_opt = 0.5279     Value = -0.3428 
elapsed = 0.01      Round = 17      eta_opt = 0.1121        max_depth_opt = 4.0000  nrounds_opt = 113.0691  subsample_opt = 0.7400  bytree_opt = 0.4776     Value = -0.1367 
elapsed = 0.01      Round = 18      eta_opt = 0.4441        max_depth_opt = 5.0000  nrounds_opt = 138.9680  subsample_opt = 0.2095  bytree_opt = 0.6869     Value = -0.5022 
elapsed = 0.01      Round = 19      eta_opt = 0.8827        max_depth_opt = 5.0000  nrounds_opt = 77.5822   subsample_opt = 0.3209  bytree_opt = 0.9544     Value = -0.1053 
elapsed = 0.01      Round = 20      eta_opt = 0.4063        max_depth_opt = 5.0000  nrounds_opt = 148.7789  subsample_opt = 0.2290  bytree_opt = 0.7593     Value = -0.3567 
elapsed = 0.01      Round = 21      eta_opt = 1.0000        max_depth_opt = 4.0000  nrounds_opt = 106.3408  subsample_opt = 0.6443  bytree_opt = 0.6353     Value = -0.0577 

 Best Parameters Found: 
Round = 9   eta_opt = 0.4420        max_depth_opt = 5.0000  nrounds_opt = 129.5805  subsample_opt = 0.7996  bytree_opt = 0.8865     Value = -0.0175 
List of 4
 $ Best_Par  : Named num [1:5] 0.442 5 129.58 0.8 0.887
  ..- attr(*, "names")= chr [1:5] "eta_opt" "max_depth_opt" "nrounds_opt" "subsample_opt" ...
 $ Best_Value: num -0.0175
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  7 variables:
  ..$ Round        : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ eta_opt      : num [1:21] 0.285 0.259 0.718 0.446 0.793 ...
  ..$ max_depth_opt: num [1:21] 5 5 5 4 6 5 6 4 5 5 ...
  ..$ nrounds_opt  : num [1:21] 113 147.5 109.4 92 76.4 ...
  ..$ subsample_opt: num [1:21] 0.405 0.855 0.412 0.4 0.529 ...
  ..$ bytree_opt   : num [1:21] 0.544 0.435 0.785 0.926 0.867 ...
  ..$ Value        : num [1:21] -0.3026 -0.133 -0.0753 -0.0841 -0.0526 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        100 obs. of  210 variables:
  ..$ V1    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V2    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V3    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V4    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V5    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V6    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V7    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V8    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V9    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V10   : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V1.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V2.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V3.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V4.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V5.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V6.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V7.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V8.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V9.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V10.1 : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V1.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V2.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V3.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V4.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V5.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V6.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V7.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V8.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V9.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V10.2 : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V1.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V2.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V3.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V4.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V5.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V6.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V7.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V8.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V9.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V10.3 : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V1.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V2.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V3.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V4.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V5.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V6.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V7.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V8.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V9.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V10.4 : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V1.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V2.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V3.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V4.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V5.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V6.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V7.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V8.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V9.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V10.5 : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V1.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V2.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V3.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V4.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V5.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V6.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V7.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V8.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V9.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V10.6 : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V1.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V2.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V3.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V4.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V5.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V6.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V7.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V8.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V9.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V10.7 : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V1.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V2.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V3.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V4.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V5.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V6.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V7.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V8.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V9.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V10.8 : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V1.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V2.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V3.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V4.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V5.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V6.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V7.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V8.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V9.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  .. [list output truncated]
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.04      Round = 1       eta_opt = 0.3996        max_depth_opt = 5.0000  nrounds_opt = 103.8797  subsample_opt = 0.6901  bytree_opt = 0.5783     Value = 1.0000 
elapsed = 0.05      Round = 2       eta_opt = 0.5996        max_depth_opt = 5.0000  nrounds_opt = 125.7482  subsample_opt = 0.3096  bytree_opt = 0.6693     Value = 1.0000 
elapsed = 0.03      Round = 3       eta_opt = 0.3946        max_depth_opt = 5.0000  nrounds_opt = 73.3337   subsample_opt = 0.1606  bytree_opt = 0.8845     Value = 0.1800 
elapsed = 0.04      Round = 4       eta_opt = 0.2905        max_depth_opt = 4.0000  nrounds_opt = 129.3648  subsample_opt = 0.1475  bytree_opt = 0.5431     Value = 0.1800 
elapsed = 0.04      Round = 5       eta_opt = 0.3845        max_depth_opt = 4.0000  nrounds_opt = 106.4619  subsample_opt = 0.3976  bytree_opt = 0.4083     Value = 1.0000 
elapsed = 0.04      Round = 6       eta_opt = 0.9525        max_depth_opt = 5.0000  nrounds_opt = 127.4542  subsample_opt = 0.2646  bytree_opt = 0.4167     Value = 1.0000 
elapsed = 0.04      Round = 7       eta_opt = 0.6955        max_depth_opt = 5.0000  nrounds_opt = 119.2315  subsample_opt = 0.5751  bytree_opt = 0.4965     Value = 1.0000 
elapsed = 0.03      Round = 8       eta_opt = 0.9005        max_depth_opt = 5.0000  nrounds_opt = 81.0287   subsample_opt = 0.8342  bytree_opt = 0.6838     Value = 1.0000 
elapsed = 0.03      Round = 9       eta_opt = 0.4042        max_depth_opt = 5.0000  nrounds_opt = 73.5520   subsample_opt = 0.5461  bytree_opt = 0.6483     Value = 1.0000 
elapsed = 0.05      Round = 10      eta_opt = 0.4913        max_depth_opt = 6.0000  nrounds_opt = 144.0938  subsample_opt = 0.1334  bytree_opt = 0.6559     Value = 0.1900 
elapsed = 0.03      Round = 11      eta_opt = 0.2058        max_depth_opt = 4.0000  nrounds_opt = 72.1364   subsample_opt = 0.4510  bytree_opt = 0.4659     Value = 1.0000 
elapsed = 0.03      Round = 12      eta_opt = 0.7855        max_depth_opt = 5.0000  nrounds_opt = 81.2798   subsample_opt = 0.3255  bytree_opt = 0.7891     Value = 1.0000 
elapsed = 0.05      Round = 13      eta_opt = 0.2276        max_depth_opt = 6.0000  nrounds_opt = 124.3278  subsample_opt = 0.9381  bytree_opt = 0.7298     Value = 1.0000 
elapsed = 0.04      Round = 14      eta_opt = 0.7902        max_depth_opt = 4.0000  nrounds_opt = 115.3598  subsample_opt = 0.6396  bytree_opt = 0.9333     Value = 1.0000 
elapsed = 0.06      Round = 15      eta_opt = 0.8579        max_depth_opt = 6.0000  nrounds_opt = 155.7652  subsample_opt = 0.9330  bytree_opt = 0.6380     Value = 1.0000 
elapsed = 0.05      Round = 16      eta_opt = 0.7991        max_depth_opt = 6.0000  nrounds_opt = 159.1933  subsample_opt = 0.9602  bytree_opt = 0.7328     Value = 1.0000 
elapsed = 0.04      Round = 17      eta_opt = 0.2204        max_depth_opt = 5.0000  nrounds_opt = 112.8439  subsample_opt = 0.8948  bytree_opt = 0.4939     Value = 1.0000 
elapsed = 0.04      Round = 18      eta_opt = 0.8253        max_depth_opt = 4.0000  nrounds_opt = 126.4373  subsample_opt = 0.6642  bytree_opt = 0.4461     Value = 1.0000 
elapsed = 0.05      Round = 19      eta_opt = 0.9071        max_depth_opt = 5.0000  nrounds_opt = 129.1942  subsample_opt = 0.6238  bytree_opt = 0.6919     Value = 1.0000 
elapsed = 0.03      Round = 20      eta_opt = 0.9343        max_depth_opt = 4.0000  nrounds_opt = 86.8685   subsample_opt = 0.9110  bytree_opt = 0.5663     Value = 1.0000 
── 1. Error: (unknown) (@test-xgb_opt.R#9)  ────────────────────────────────────
task 1 failed - "non-finite value supplied by optim"
1: xgb_opt(train_data = tr, train_label = y, test_data = ts, test_label = y, objectfun = "multi:softmax", 
       evalmetric = "merror", classes = 10, init_points = 20, n_iter = 1) at testthat/test-xgb_opt.R:9
2: BayesianOptimization(xgb_holdout, bounds = list(eta_opt = eta_range, max_depth_opt = max_depth_range, 
       nrounds_opt = nrounds_range, subsample_opt = subsample_range, bytree_opt = bytree_range), 
       init_points, init_grid_dt = NULL, n_iter, acq, kappa, eps, optkernel, verbose = TRUE)
3: Utility_Max(DT_bounds, GP, acq = acq, y_max = max(DT_history[, Value]), kappa = kappa, 
       eps = eps) %>% Min_Max_Inverse_Scale_Vec(., lower = DT_bounds[, Lower], upper = DT_bounds[, 
       Upper]) %>% magrittr::set_names(., DT_bounds[, Parameter]) %>% inset(., DT_bounds[Type == 
       "integer", Parameter], round(extract(., DT_bounds[Type == "integer", Parameter])))
4: eval(lhs, parent, parent)
5: eval(lhs, parent, parent)
6: Utility_Max(DT_bounds, GP, acq = acq, y_max = max(DT_history[, Value]), kappa = kappa, 
       eps = eps)
7: foreach(i = 1:nrow(Mat_tries), .combine = "rbind") %do% {
       optim_result <- optim(par = Mat_tries[i, ], fn = Utility, GP = GP, acq = acq, 
           y_max = y_max, kappa = kappa, eps = eps, method = "L-BFGS-B", lower = rep(0, 
               length(DT_bounds[, Lower])), upper = rep(1, length(DT_bounds[, Upper])), 
           control = list(maxit = 100, factr = 5e+11))
       c(optim_result$par, optim_result$value)
   } %>% data.table(.) %>% setnames(., old = names(.), new = c(DT_bounds[, Parameter], 
       "Negetive_Utility"))
8: eval(lhs, parent, parent)
9: eval(lhs, parent, parent)
10: foreach(i = 1:nrow(Mat_tries), .combine = "rbind") %do% {
       optim_result <- optim(par = Mat_tries[i, ], fn = Utility, GP = GP, acq = acq, 
           y_max = y_max, kappa = kappa, eps = eps, method = "L-BFGS-B", lower = rep(0, 
               length(DT_bounds[, Lower])), upper = rep(1, length(DT_bounds[, Upper])), 
           control = list(maxit = 100, factr = 5e+11))
       c(optim_result$par, optim_result$value)
   }
11: e$fun(obj, substitute(ex), parent.frame(), e$data)

══ testthat results  ═══════════════════════════════════════════════════════════
OK: 0 SKIPPED: 0 FAILED: 1
1. Error: (unknown) (@test-xgb_opt.R#9) 

Error: testthat unit tests failed
Execution halted

add function argument to allow for nthread > 1

Currently xgboost runs with a single thread because nthread = 1 is hard-coded in xgb_cv_opt.R and xgb_opt.R. It would be nice if the user had the option to override this default; a sketch of the idea follows.
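The change itself is small, since nthread is just another entry in xgboost's parameter list. A sketch of the requested behavior using xgboost directly; the data and thread count are illustrative, not the package's actual code:

library(xgboost)
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

cv <- xgb.cv(params = list(objective   = "binary:logistic",
                           eval_metric = "logloss",
                           nthread     = 4),   # user-supplied instead of fixed 1
             data = dtrain, nrounds = 10, nfold = 3, verbose = 0)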

Random Forest example fails

Hello,

I'm trying the random forest example shown in the README and running into an error - any ideas?

> library(MlBayesOpt)
> set.seed(123)
> mod <- rf_opt(
+   train_data = iris_train,
+   train_label = iris_train$Species,
+   test_data = iris_test,
+   test_label = iris_test$Species,
+   mtry_range = c(1L, 4L)
+ )
 Error in eval(f[[2]], envir = data) : object 'trainlabel' not found Timing stopped at: 0.196 0.003 0.2
> traceback()
15: eval(f[[2]], envir = data)
14: eval(f[[2]], envir = data)
13: data.frame(eval(f[[2]], envir = data))
12: parse.formula(formula, data)
11: ranger(trainlabel ~ ., dtrain, num.trees = num_trees_opt, mtry = mtry_opt)
10: (function (num_trees_opt, mtry_opt) 
    {
        model <- ranger(trainlabel ~ ., dtrain, num.trees = num_trees_opt, 
            mtry = mtry_opt)
        t.pred <- predict(model, dat = dtest)
        Pred <- sum(diag(table(testlabel, t.pred$predictions)))/nrow(dtest)
        list(Score = Pred, Pred = Pred)
    })(num_trees_opt = 288, mtry_opt = 4)
9: do.call(what = FUN, args = as.list(This_Par))
8: system.time({
       This_Score_Pred <- do.call(what = FUN, args = as.list(This_Par))
   })
7: eval(expr, pf)
6: eval(expr, pf)
5: withVisible(eval(expr, pf))
4: evalVis(expr)
3: utils::capture.output({
       This_Time <- system.time({
           This_Score_Pred <- do.call(what = FUN, args = as.list(This_Par))
       })
   })
2: BayesianOptimization(rf_holdout, bounds = list(num_trees_opt = num_tree_range, 
       mtry_opt = mtry_range), init_points, init_grid_dt = NULL, 
       n_iter, acq, kappa, eps, verbose = TRUE)
1: rf_opt(train_data = iris_train, train_label = iris_train$Species, 
       test_data = iris_test, test_label = iris_test$Species, mtry_range = c(1L, 
           4L))

Thanks,
Chris
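Until this is fixed, a hedged workaround is to run the same search directly with ranger and rBayesianOptimization, mirroring what the traceback shows rf_opt doing internally; the ranges below are illustrative:

library(MlBayesOpt)   # for the iris_train / iris_test data
library(ranger)
library(rBayesianOptimization)

# Holdout-accuracy objective, analogous to the package's internal rf_holdout
rf_holdout <- function(num_trees_opt, mtry_opt) {
  model <- ranger(Species ~ ., data = iris_train,
                  num.trees = floor(num_trees_opt), mtry = floor(mtry_opt))
  preds <- predict(model, data = iris_test)$predictions
  acc <- mean(preds == iris_test$Species)
  list(Score = acc, Pred = acc)
}

set.seed(123)
res <- BayesianOptimization(rf_holdout,
                            bounds = list(num_trees_opt = c(100L, 500L),
                                          mtry_opt = c(1L, 4L)),
                            init_points = 10, n_iter = 5, verbose = TRUE)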
