Bex T. · 10 · python, machine-learning, hyperparameters, optuna
How can I optimize multiple metrics simultaneously inside an Optuna `objective` function? For example, I'm training an LGBM classifier and want to find the best set of hyperparameters across all the common classification metrics (F1, precision, recall, accuracy, AUC, etc.).
def objective(trial):
    # Train
    gbm = lgb.train(param, dtrain)
    preds = gbm.predict(X_test)
    pred_labels = np.rint(preds)

    # Calculate metrics (sklearn expects y_true first, then y_pred)
    accuracy = sklearn.metrics.accuracy_score(y_test, pred_labels)
    recall = sklearn.metrics.recall_score(y_test, pred_labels)
    precision = sklearn.metrics.precision_score(y_test, pred_labels)
    f1 = sklearn.metrics.f1_score(y_test, pred_labels, pos_label=1)
    ...
How do I do this?
Bex T. · 20
After defining the grid, fitting the model with those parameters, and generating predictions, compute all the metrics you want to optimize:
def objective(trial):
    param_grid = {"n_estimators": trial.suggest_int("n_estimators", 2000, 10000, step=200)}
    clf = lgbm.LGBMClassifier(objective="binary", **param_grid)
    clf.fit(X_train, y_train)

    preds = clf.predict(X_valid)
    probs = clf.predict_proba(X_valid)

    # Metrics (y_true first, then the predictions)
    f1 = sklearn.metrics.f1_score(y_valid, preds)
    accuracy = sklearn.metrics.accuracy_score(y_valid, preds)
    precision = sklearn.metrics.precision_score(y_valid, preds)
    recall = sklearn.metrics.recall_score(y_valid, preds)
    logloss = sklearn.metrics.log_loss(y_valid, probs)
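To sanity-check these calls outside a study, here is a minimal sketch with made-up labels and predictions (the `y_valid`, `preds`, and `probs` arrays are hypothetical stand-ins for real validation data), assuming scikit-learn is installed. Note the argument order: ground truth first, predictions second.

```python
import numpy as np
from sklearn import metrics

# Hypothetical validation labels and model outputs, just to illustrate the calls
y_valid = np.array([0, 1, 1, 0, 1])
preds = np.array([0, 1, 0, 0, 1])          # hard class predictions
probs = np.array([[0.8, 0.2],              # per-class probabilities,
                  [0.3, 0.7],              # as predict_proba would return
                  [0.6, 0.4],
                  [0.9, 0.1],
                  [0.2, 0.8]])

f1 = metrics.f1_score(y_valid, preds)
accuracy = metrics.accuracy_score(y_valid, preds)
precision = metrics.precision_score(y_valid, preds)
recall = metrics.recall_score(y_valid, preds)
logloss = metrics.log_loss(y_valid, probs)

print(f1, accuracy, precision, recall, logloss)
```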
and return them in whatever order you like:
def objective(trial):
    ...
    return f1, logloss, accuracy, precision, recall
Then, when creating the study, tell Optuna whether each metric should be minimized or maximized by passing `directions` in the same order:
study = optuna.create_study(directions=['maximize', 'minimize', 'maximize', 'maximize', 'maximize'])
study.optimize(objective, n_trials=100)
For more details, see Multi-objective Optimization with Optuna in the documentation.