横尾修*_*尾修平  6
python · machine-learning · scikit-learn · cross-validation · hyperopt
I'm using hyperopt to search for the best parameters of an SVM classifier, but hyperopt says the best 'kernel' is '0'. {'kernel': '0'} is obviously not a valid kernel.
Does anyone know whether this is caused by a mistake of mine or by a bug in hyperopt?
The code is below.
from hyperopt import fmin, tpe, hp, rand
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn import svm
from sklearn.cross_validation import StratifiedKFold
parameter_space_svc = {
'C':hp.loguniform("C", np.log(1), np.log(100)),
'kernel':hp.choice('kernel',['rbf','poly']),
'gamma': hp.loguniform("gamma", np.log(0.001), np.log(0.1)),
}
from sklearn import datasets
iris = datasets.load_digits()
train_data = iris.data
train_target = iris.target
count = 0
def function(args):
    print(args)
    score_avg = 0
    skf = StratifiedKFold(train_target, n_folds=3, shuffle=True, random_state=1)
    for train_idx, test_idx in skf:
        train_X = iris.data[train_idx]
        train_y = iris.target[train_idx]
        test_X = iris.data[test_idx]
        test_y = iris.target[test_idx]
        clf = svm.SVC(**args)
        clf.fit(train_X, train_y)
        prediction = clf.predict(test_X)
        score = accuracy_score(test_y, prediction)
        score_avg += score
    score_avg /= len(skf)
    global count
    count = count + 1
    print("round %s" % str(count), score_avg)
    return -score_avg
best = fmin(function, parameter_space_svc, algo=tpe.suggest, max_evals=100)
print("best estimate parameters",best)
The output is below.
best estimate parameters {'C': 13.271912841932233, 'gamma': 0.0017394328334592358, 'kernel': 0}
Viv*_*mar 16
First, the sklearn.cross_validation module you are using has been deprecated since version 0.18, so please update to sklearn.model_selection.
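As a hedged sketch of that migration (variable names here are illustrative, not from the question): in sklearn.model_selection, StratifiedKFold takes n_splits instead of n_folds, the labels are passed to split() rather than the constructor, and the fold count comes from get_n_splits() instead of len(skf).

```python
# Sketch: the question's 3-fold loop rewritten for sklearn.model_selection.
from sklearn import datasets
from sklearn.model_selection import StratifiedKFold

digits = datasets.load_digits()
X, y = digits.data, digits.target

# n_splits replaces n_folds; labels are no longer a constructor argument
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=1)
for train_idx, test_idx in skf.split(X, y):  # split() now takes X and y
    train_X, train_y = X[train_idx], y[train_idx]
    test_X, test_y = X[test_idx], y[test_idx]

# len(skf) is gone; use get_n_splits() for the averaging denominator
n_folds = skf.get_n_splits()
```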
Now to the main problem: the best returned by fmin always reports parameters defined with hp.choice as the index of the chosen option.
So in your case, 'kernel': 0 means the first value ('rbf') was selected as the best kernel.
See this issue, which confirms it:
To get the original values from best, use the space_eval() function:
from hyperopt import space_eval
space_eval(parameter_space_svc, best)
Output:
{'C': 13.271912841932233, 'gamma': 0.0017394328334592358, 'kernel': 'rbf'}