11 matlab machine-learning decision-tree
I looked at the help in MATLAB, but it only gives an example without explaining how to use the parameters of the 'classregtree' function. Any help explaining 'classregtree' and the use of its parameters would be greatly appreciated.
Amr*_*mro 34
The documentation page of the classregtree function is fairly self-explanatory. Let's go over some of the most common parameters of the classification tree model with a complete example that illustrates the process:
%# load data
load carsmall
%# construct predicting attributes and target class
vars = {'MPG' 'Cylinders' 'Horsepower' 'Model_Year'};
x = [MPG Cylinders Horsepower Model_Year]; %# mixed continuous/discrete data
y = cellstr(Origin); %# class labels
%# train classification decision tree
t = classregtree(x, y, 'method','classification', 'names',vars, ...
'categorical',[2 4], 'prune','off');
view(t)
%# test
yPredicted = eval(t, x);
cm = confusionmat(y,yPredicted); %# confusion matrix
N = sum(cm(:));
err = ( N-sum(diag(cm)) ) / N; %# testing error
%# prune tree to avoid overfitting
tt = prune(t, 'level',3);
view(tt)
%# predict a new unseen instance
inst = [33 4 78 NaN];
prediction = eval(tt, inst) %# pred = 'Japan'
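If you are unsure how aggressively to prune, the classregtree object also provides a test method that cross-validates the tree and suggests the best pruning level. A minimal sketch, assuming that method is available in your Statistics Toolbox release:
%# cross-validate to estimate the misclassification cost at each pruning level
[cost,secost,ntnodes,bestlevel] = test(t, 'crossvalidate', x, y);
tBest = prune(t, 'level',bestlevel); %# prune to the suggested level
view(tBest)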
The classregtree class shown above is now obsolete; it was superseded by the ClassificationTree and RegressionTree classes in R2011a (see also the fitctree and fitrtree functions introduced in R2014a).
Here is the updated example using the new functions/classes:
%# train classification tree with the newer fitctree API
t = fitctree(x, y, 'PredictorNames',vars, ...
    'CategoricalPredictors',{'Cylinders', 'Model_Year'}, 'Prune','off');
view(t, 'mode','graph')

%# predict on the training data and compute the confusion matrix
y_hat = predict(t, x);
cm = confusionmat(y,y_hat);

%# prune the tree and predict a new unseen instance
tt = prune(t, 'Level',3);
view(tt)
predict(tt, [33 4 78 NaN])
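For completeness, the newer tree object can also report its error directly. A minimal sketch, assuming the resubLoss, crossval, and kfoldLoss methods are available in your release:
errResub = resubLoss(t);      %# misclassification error on the training data
cvModel = crossval(t);        %# 10-fold cross-validated partitioned model
errCV = kfoldLoss(cvModel);   %# cross-validated classification error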