Looking for a MATLAB implementation of the CART algorithm
Please don't post the version floating around on Baidu that only contains the function definitions. I need something that actually runs, with an example main program. Ideally it would come with some explanation and be free of syntax errors. Of course, taking that online code, fixing it up, and adding a main program would also be fine.
Failing that, a program in C, C++, Java, or R that runs and produces results would also be acceptable.
function D = CART(train_features, train_targets, params, region)
% Classify using classification and regression trees
% Inputs:
% features - Train features
% targets - Train targets
% params - [Impurity type, Percentage of incorrectly assigned samples at a node]
% Impurity can be: Entropy, Variance (or Gini), or Misclassification
% region - Decision region vector: [-x x -y y number_of_points]
%
% Outputs
% D - Decision surface
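% (For reference, the impurity options above are the standard CART node criteria:
%  with class proportions p_j at a node, Entropy = -sum_j p_j*log2(p_j),
%  Gini = 1 - sum_j p_j^2, and Misclassification = 1 - max_j p_j.
%  The split that most reduces the chosen impurity is selected.)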
[Ni, M] = size(train_features);
%Get parameters
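% NOTE: process_params, PCA (a toolbox helper here, not MATLAB's built-in pca), and
% CARTfunctions (used further below) are external helper functions from the toolbox
% this code was taken from; they must be on the MATLAB path.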
[split_type, inc_node] = process_params(params);
%For the decision region
N = region(5);
mx = ones(N,1) * linspace (region(1),region(2),N);
my = linspace (region(3),region(4),N)' * ones(1,N);
flatxy = [mx(:), my(:)]';
%Preprocessing
[f, t, UW, m] = PCA(train_features, train_targets, Ni, region);
train_features = UW * (train_features - m*ones(1,M));
flatxy = UW * (flatxy - m*ones(1,N^2));
%Build the tree recursively
disp('Building tree')
tree = make_tree(train_features, train_targets, M, split_type, inc_node, region);
%Make the decision region according to the tree
disp('Building decision surface using the tree')
targets = use_tree(flatxy, 1:N^2, tree);
D = reshape(targets,N,N);
%END
function targets = use_tree(features, indices, tree)
%Classify recursively using a tree
if isnumeric(tree.Raction)
%Reached an end node
targets = zeros(1,size(features,2));
targets(indices) = tree.Raction(1);
else
%Reached a branching, so:
%Find who goes where
in_right = indices(find(eval(tree.Raction)));
in_left = indices(find(eval(tree.Laction)));
Ltargets = use_tree(features, in_left, tree.left);
Rtargets = use_tree(features, in_right, tree.right);
targets = Ltargets + Rtargets;
end
%END use_tree
function tree = make_tree(features, targets, Dlength, split_type, inc_node, region)
%Build a tree recursively
if (length(unique(targets)) == 1),
%There is only one type of targets, and this generates a warning, so deal with it separately
tree.right = [];
tree.left = [];
tree.Raction = targets(1);
tree.Laction = targets(1);
return
end
[Ni, M] = size(features);
Nt = unique(targets);
N = hist(targets, Nt);
if ((sum(N < Dlength*inc_node) == length(Nt) - 1) | (M == 1)),
%No further splitting is necessary
tree.right = [];
tree.left = [];
if (length(Nt) ~= 1),
MLlabel = find(N == max(N));
else
MLlabel = 1;
end
tree.Raction = Nt(MLlabel);
tree.Laction = Nt(MLlabel);
else
%Split the node according to the splitting criterion
I = zeros(1,Ni);
split_point = zeros(1,Ni);
op = optimset('Display', 'off');
for i = 1:Ni,
split_point(i) = fminbnd(@(x) CARTfunctions(x, features, targets, i, split_type), region(i*2-1), region(i*2), op);
I(i) = CARTfunctions(split_point(i), features, targets, i, split_type);
end
[m, dim] = min(I);
loc = split_point(dim);
%So, the split is to be on dimension 'dim' at location 'loc'
indices = 1:M;
tree.Raction= ['features(' num2str(dim) ',indices) > ' num2str(loc)];
tree.Laction= ['features(' num2str(dim) ',indices) <= ' num2str(loc)];
in_right = find(eval(tree.Raction));
in_left = find(eval(tree.Laction));
if isempty(in_right) | isempty(in_left)
%No possible split found
tree.right = [];
tree.left = [];
if (length(Nt) ~= 1),
MLlabel = find(N == max(N));
else
MLlabel = 1;
end
tree.Raction = Nt(MLlabel);
tree.Laction = Nt(MLlabel);
else
%...It's possible to build new nodes
tree.right = make_tree(features(:,in_right), targets(in_right), Dlength, split_type, inc_node, region);
tree.left = make_tree(features(:,in_left), targets(in_left), Dlength, split_type, inc_node, region);
end
end
Follow-up: Didn't I say not to post that one? I need the main program.
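For what it's worth, a minimal driver for the toolbox-style function above could look like the sketch below. It assumes the code is saved as CART.m with the signature shown, and that the helper functions it calls (process_params, PCA, CARTfunctions) are on the path. The synthetic two-class data, the params string, and the plotting are purely illustrative, and the exact format process_params expects may differ.

% Hypothetical driver script for the toolbox-style CART.m above.
% Assumes process_params, PCA and CARTfunctions are available on the path.
rng(0);                                           % reproducible synthetic data
M1 = 100; M2 = 100;                               % samples per class
train_features = [randn(2,M1)-1, randn(2,M2)+1];  % 2 x M feature matrix
train_targets  = [zeros(1,M1), ones(1,M2)];       % 1 x M label vector (0/1)
params = 'Entropy, 0.05';     % impurity type and inc_node value
                              % (check process_params for the exact format)
region = [-4 4 -4 4 100];     % decision region: [-x x -y y number_of_points]
D = CART(train_features, train_targets, params, region);
% Plot the decision surface with the training points on top
Npts = region(5);
imagesc(linspace(region(1),region(2),Npts), ...
        linspace(region(3),region(4),Npts), D);
axis xy; hold on
plot(train_features(1,train_targets==0), train_features(2,train_targets==0), 'ko');
plot(train_features(1,train_targets==1), train_features(2,train_targets==1), 'w+');
hold off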
Answer A: This one is tested on a dataset that ships with MATLAB, so it can be run as-is.
%% Created by Indiffer
% Data preparation
load ionosphere;                               % built-in sample data: X (features), Y (labels)
n_samples = size(X,1);                         % avoid shadowing the built-in length()
rng(1);                                        % reproducible
indices = crossvalind('Kfold', n_samples, 5);  % randomly split the samples into 5 folds
i = 1;                                         % four folds for training, one for testing
test = (indices == i);
train = ~test;
X_train = X(train, :);
Y_train = Y(train, :);
X_test = X(test, :);
Y_test = Y(test, :);
% Build the CART classification tree
tree = fitctree(X_train, Y_train);
view(tree, 'Mode', 'graph');                   % display the tree graphically
rules_num = (tree.IsBranchNode == 0);
rules_num = sum(rules_num);                    % number of rules (leaf nodes)
Cart_result = predict(tree, X_test);           % classify the test samples
Cart_result = cell2mat(Cart_result);
Y_test = cell2mat(Y_test);
Cart_result = (Cart_result == Y_test);
Cart_length = size(Cart_result, 1);            % compute the accuracy
Cart_rate = (sum(Cart_result)) / Cart_length;
disp(['Number of rules: ' num2str(rules_num)]);
disp(['Test set accuracy: ' num2str(Cart_rate)]);
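The answer above evaluates only fold 1 of the 5-fold split it creates. If you want the cross-validated accuracy over all five folds, a small extension (a sketch, not part of the original answer) could look like this:

% Hypothetical extension: average fitctree accuracy over all 5 folds
% of the same ionosphere data (names mirror the answer above).
load ionosphere;
rng(1);
n_samples = size(X,1);
indices = crossvalind('Kfold', n_samples, 5);
fold_acc = zeros(1,5);
for k = 1:5
    test  = (indices == k);
    train = ~test;
    tree  = fitctree(X(train,:), Y(train,:));
    pred  = predict(tree, X(test,:));
    fold_acc(k) = mean(strcmp(pred, Y(test,:)));  % fraction correct in fold k
end
disp(['Per-fold accuracy: ' num2str(fold_acc)]);
disp(['Mean 5-fold accuracy: ' num2str(mean(fold_acc))]);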