Usage Examples of MATLAB's Built-in Classifiers
Reposted in full from u014114990's column: https://blog.csdn.net/u014114990/article/details/51067059
Personal suggestion: decide on one M-by-N layout for the training and test data up front (the classifiers below expect one sample per row and one feature per column), so the transpose operations in the code later on become unnecessary.
The MATLAB classifiers covered here are: the K-nearest neighbor classifier, the random forest classifier, naive Bayes, ensemble learning methods, the discriminant analysis classifier, and the support vector machine. The usage of their main functions is summarized below; see the MATLAB help files for further details.

Let
  Training samples: train_data   % matrix, one sample per row, one feature per column
  Training labels:  train_label  % column vector
  Test samples:     test_data
  Test labels:      test_label
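If you want to run the snippets below without the author's data, here is a minimal sketch that fabricates a two-class dataset in the layout assumed above (the variable names follow the conventions of this post; the data itself is made up purely for illustration):

% Minimal sketch: synthetic two-class data in the layout assumed above.
rng(0);                                   % reproducible random numbers
nPerClass = 50;  nFeatures = 4;
Xpos = randn(nPerClass, nFeatures) + 1;   % class +1, centered at (1,1,...)
Xneg = randn(nPerClass, nFeatures) - 1;   % class -1, centered at (-1,-1,...)
train_data  = [Xpos(1:25,:);  Xneg(1:25,:)];
train_label = [ones(25,1);   -ones(25,1)];
test_data   = [Xpos(26:50,:); Xneg(26:50,:)];
test_label  = [ones(25,1);   -ones(25,1)];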
K-nearest neighbor classifier (KNN)
mdl = ClassificationKNN.fit(train_data, train_label, 'NumNeighbors', 1);
predict_label = predict(mdl, test_data);
accuracy = length(find(predict_label == test_label))/length(test_label)*100
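In newer MATLAB releases, ClassificationKNN.fit has been superseded by fitcknn (available since R2014a). A hedged equivalent of the call above, assuming numeric class labels so the elementwise comparison works:

mdl = fitcknn(train_data, train_label, 'NumNeighbors', 1);   % same model, newer API
predict_label = predict(mdl, test_data);
accuracy = mean(predict_label == test_label) * 100            % percent correct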
Random forest classifier (Random Forest)
B = TreeBagger(nTree, train_data, train_label);   % nTree = number of trees, set beforehand
predict_label = predict(B, test_data);
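Note that for classification TreeBagger's predict returns a cell array of character labels rather than a numeric vector, so the accuracy formula used for the other classifiers does not apply directly. A hedged sketch of converting the output back to numbers (assuming the labels are numeric codes such as +1/-1, and nTree = 50 as an arbitrary choice):

nTree = 50;                                          % number of trees (illustrative value)
B = TreeBagger(nTree, train_data, train_label);
predict_label = str2double(predict(B, test_data));   % cell array of char -> numeric labels
accuracy = mean(predict_label == test_label) * 100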
Naive Bayes (Naïve Bayes)
nb = NaiveBayes.fit(train_data, train_label);
predict_label = predict(nb, test_data);
accuracy = length(find(predict_label == test_label))/length(test_label)*100;
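The NaiveBayes class has since been replaced by fitcnb (introduced in R2014b). A hedged equivalent sketch:

nb = fitcnb(train_data, train_label);          % Gaussian naive Bayes by default for numeric features
predict_label = predict(nb, test_data);
accuracy = mean(predict_label == test_label) * 100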
Ensemble learning methods (Ensembles for Boosting, Bagging, or Random Subspace)
ens = fitensemble(train_data, train_label, 'AdaBoostM1', 100, 'tree', 'type', 'classification');
predict_label = predict(ens, test_data);
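In releases from R2016b onward, fitcensemble is the recommended classification-specific function. A hedged equivalent of the call above:

ens = fitcensemble(train_data, train_label, 'Method', 'AdaBoostM1', ...
                   'NumLearningCycles', 100, 'Learners', 'tree');
predict_label = predict(ens, test_data);
accuracy = mean(predict_label == test_label) * 100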
Discriminant analysis classifier
obj = ClassificationDiscriminant.fit(train_data, train_label);
predict_label = predict(obj, test_data);
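The current recommended entry point is fitcdiscr (linear discriminant analysis by default). A hedged sketch:

obj = fitcdiscr(train_data, train_label);      % linear discriminant by default
predict_label = predict(obj, test_data);
accuracy = mean(predict_label == test_label) * 100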
Support vector machine (Support Vector Machine, SVM)
SVMStruct = svmtrain(train_data, train_label);
predict_label = svmclassify(SVMStruct, test_data)
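svmtrain and svmclassify were removed in R2018a; fitcsvm and predict replace them for two-class problems. A hedged equivalent sketch:

svmModel = fitcsvm(train_data, train_label);   % linear kernel by default
predict_label = predict(svmModel, test_data);
accuracy = mean(predict_label == test_label) * 100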
Code:

clc
clear all
load('wdtFeature');     % provides traindata, trainlabel, testdata, testlabel

%   Training samples: train_data  % matrix, one sample per row, one feature per column
%   Training labels:  train_label % column vector
%   Test samples:     test_data
%   Test labels:      test_label
% The stored data has one sample per column, so transpose into the
% row-per-sample layout that the classifiers expect.
train_data = traindata';
train_label = trainlabel';
test_data = testdata';
test_label = testlabel';

% K-nearest neighbor classifier (KNN)
% mdl = ClassificationKNN.fit(train_data, train_label, 'NumNeighbors', 1);
% predict_label = predict(mdl, test_data);
% accuracy = length(find(predict_label == test_label))/length(test_label)*100
%
% result: 94%
% **********************************************************************
% Random forest classifier (Random Forest)
% nTree = 5
% B = TreeBagger(nTree, train_data, train_label);
% predict_label = predict(B, test_data);
%
% The test set holds 50 positive samples followed by 50 negative samples;
% m and n count the correctly classified samples in each half.
% Note: TreeBagger returns a cell array of character labels, so the
% comparisons below act on char codes rather than numeric labels, which is
% why this experiment only reports 50%; converting the output with
% str2double(predict(B, test_data)) would give the intended counts.
% m = 0;
% n = 0;
% for i = 1:50
%     if predict_label{i,1} > 0
%         m = m + 1;
%     end
%     if predict_label{i+50,1} < 0
%         n = n + 1;
%     end
% end
%
% s = m + n
% r = s/100
% result: 50%
% **********************************************************************
% Naive Bayes
% nb = NaiveBayes.fit(train_data, train_label);
% predict_label = predict(nb, test_data);
% accuracy = length(find(predict_label == test_label))/length(test_label)*100;
%
% result: 81%
% **********************************************************************
% Ensemble learning (Ensembles for Boosting, Bagging, or Random Subspace)
% ens = fitensemble(train_data, train_label, 'AdaBoostM1', 100, 'tree', 'type', 'classification');
% predict_label = predict(ens, test_data);
%
% m = 0;
% n = 0;
% for i = 1:50
%     if predict_label(i,1) > 0
%         m = m + 1;
%     end
%     if predict_label(i+50,1) < 0
%         n = n + 1;
%     end
% end
%
% s = m + n
% r = s/100
% result: 97%
% **********************************************************************
% Discriminant analysis classifier
% obj = ClassificationDiscriminant.fit(train_data, train_label);
% predict_label = predict(obj, test_data);
%
% m = 0;
% n = 0;
% for i = 1:50
%     if predict_label(i,1) > 0
%         m = m + 1;
%     end
%     if predict_label(i+50,1) < 0
%         n = n + 1;
%     end
% end
%
% s = m + n
% r = s/100
% result: 86%
% **********************************************************************
% Support vector machine (SVM)
SVMStruct = svmtrain(train_data, train_label);
predict_label = svmclassify(SVMStruct, test_data)

% Count correct predictions in each half of the test set
% (first 50 samples are positive, last 50 are negative).
m = 0;
n = 0;
for i = 1:50
    if predict_label(i,1) > 0
        m = m + 1;
    end
    if predict_label(i+50,1) < 0
        n = n + 1;
    end
end
s = m + n       % number of correct predictions
r = s/100       % accuracy
% result: 86%
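Because the wdtFeature data used above is not publicly available, here is a hedged, self-contained sketch that runs a similar comparison on MATLAB's built-in fisheriris dataset with the current fitc* functions. The dataset choice, the split, and the accuracy helper are mine, not the original author's, so the numbers will differ from the percentages reported above:

% Self-contained comparison on the built-in fisheriris dataset (illustrative only).
clc; clear;
load fisheriris                          % provides meas (150x4) and species (150x1 cellstr)
rng(1);                                  % reproducible split
cv = cvpartition(species, 'HoldOut', 0.3);
train_data  = meas(training(cv), :);
train_label = species(training(cv));
test_data   = meas(test(cv), :);
test_label  = species(test(cv));

models = {
    'KNN',          fitcknn(train_data, train_label, 'NumNeighbors', 1);
    'Naive Bayes',  fitcnb(train_data, train_label);
    'AdaBoost',     fitcensemble(train_data, train_label, 'Method', 'AdaBoostM2');
    'Discriminant', fitcdiscr(train_data, train_label);
    'SVM (ECOC)',   fitcecoc(train_data, train_label)   % multiclass wrapper around binary SVMs
};

for k = 1:size(models, 1)
    predict_label = predict(models{k,2}, test_data);
    acc = mean(strcmp(predict_label, test_label)) * 100;   % labels are cell arrays of char here
    fprintf('%-14s accuracy: %.1f%%\n', models{k,1}, acc);
end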