Post by Thi*_* K.

How to use the early_stopping_rounds parameter in XGBoost

I am working through a tutorial from DataCamp's Extreme Gradient Boosting with XGBoost course, and I am a bit confused by one of the results.

When I execute the following code:

import xgboost as xgb

# data (features) and y (target) are preloaded by the tutorial's housing dataset

# Create your housing DMatrix: housing_dmatrix
housing_dmatrix = xgb.DMatrix(data=data, label=y)

# Create the parameter dictionary for each tree: params
params = {"objective": "reg:linear", "max_depth": 4}

# Perform cross-validation with early stopping: cv_results
cv_results = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=3,
                    num_boost_round=50, early_stopping_rounds=10,
                    metrics="rmse", as_pandas=True, seed=123)

# Print cv_results
print(cv_results)

# Best (lowest) mean test RMSE and the round at which it occurred
mean_rmse = cv_results['test-rmse-mean'].min()
boost_rounds = cv_results['test-rmse-mean'].idxmin()
print("\tRMSE {} for {} rounds".format(mean_rmse, boost_rounds))

I get this output:

    test-rmse-mean  test-rmse-std  train-rmse-mean  train-rmse-std
0    142644.104167     705.732300    141861.109375      396.179855
1    104867.638021     109.049658    103035.130209       47.104957
2     79261.453125 …
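For context, here is a minimal, self-contained sketch of the same cross-validation call (assumptions: xgboost and scikit-learn are installed, scikit-learn's make_regression stands in for the tutorial's housing data, and the objective is written as "reg:squarederror", the current name of the deprecated "reg:linear"). With early_stopping_rounds set, xgb.cv returns one row per boosting round up to the best iteration, so the length of the result shows whether training stopped before num_boost_round:

import xgboost as xgb
from sklearn.datasets import make_regression

# Synthetic data standing in for the tutorial's housing dataset (assumption)
X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=123)
dtrain = xgb.DMatrix(data=X, label=y)

# "reg:linear" was renamed "reg:squarederror" in recent XGBoost releases
params = {"objective": "reg:squarederror", "max_depth": 4}

cv_results = xgb.cv(dtrain=dtrain, params=params, nfold=3,
                    num_boost_round=50, early_stopping_rounds=10,
                    metrics="rmse", as_pandas=True, seed=123)

# If the mean test RMSE fails to improve for 10 consecutive rounds,
# xgb.cv stops early and clips the returned history at the best round,
# so fewer than 50 rows can come back.
print(len(cv_results), "rounds kept")
print("best test RMSE:", cv_results["test-rmse-mean"].min(),
      "at round", cv_results["test-rmse-mean"].idxmin())

Comparing len(cv_results) against num_boost_round in the question's own output is one way to tell whether early stopping actually triggered.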

python-3.x cross-validation xgboost

