How to get the number of support vectors after cross-validation
Posted: 2016-06-01

This is my code for digit classification with a non-linear SVM. I apply a cross-validation scheme to select the hyperparameters C and gamma. However, the model returned by GridSearchCV has no n_support_ attribute for getting the number of support vectors.
from sklearn import datasets
from sklearn.cross_validation import train_test_split
from sklearn.grid_search import GridSearchCV
from sklearn.metrics import classification_report
from sklearn.svm import SVC
from sklearn.cross_validation import ShuffleSplit
# Loading the Digits dataset
digits = datasets.load_digits()
# To apply an classifier on this data, we need to flatten the image, to
# turn the data in a (samples, feature) matrix:
n_samples = len(digits.images)
X = digits.images.reshape((n_samples, -1))
y = digits.target
# Split the dataset in two equal parts
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.5, random_state=0)
#Intilize an svm estimator
estimator=SVC(kernel='rbf',C=1,gamma=1)
#Choose cross validation iterator.
cv = ShuffleSplit(X_train.shape[0], n_iter=5, test_size=0.2, random_state=0)
# Set the parameters by cross-validation
tuned_parameters = [{'kernel': ['rbf'], 'gamma': [1e-3, 1e-4, 1, 2, 10],
                     'C': [1, 10, 50, 100, 1000]},
                    {'kernel': ['linear'], 'C': [1, 10, 100, 1000]}]
clf=GridSearchCV(estimator=estimator, cv=cv, param_grid=tuned_parameters)
#begin the cross-validation task to get the best model with best parameters.
#After this task, we get a clf as a best model with best parameters C and gamma.
clf.fit(X_train,y_train)
print()
print("Best parameters: ")
print(clf.best_params_)
print("error test set with clf", clf.score(X_test, y_test))
print("error training set with clf", clf.score(X_train, y_train))
# It does not work. So, how can I retrieve the number of support vectors?
print("Number of support vectors by class", clf.n_support_)
## Here is my approach: I train a new SVC with the best parameters and note
## that it gives the same test and train errors as clf.
clf2 = SVC(C=10, gamma=0.001)
clf2.fit(X_train, y_train)
print("error test set with clf2", clf2.score(X_test, y_test))
print("error training set with clf2", clf2.score(X_train, y_train))
print(clf2.n_support_)
Any comments on whether this approach of mine is correct?
Answer 1:

GridSearchCV fits multiple models. You can get the best one with clf.best_estimator_. To find the indices of the support vectors in your training set, use clf.best_estimator_.support_; clf.best_estimator_.n_support_ gives the number of support vectors per class, so clf.best_estimator_.n_support_.sum() is the total. You can also get the best model's parameters and score with clf.best_params_ and clf.best_score_, respectively.
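A minimal sketch of the above, assuming a recent scikit-learn where sklearn.model_selection replaced the sklearn.cross_validation and sklearn.grid_search modules used in the question; the parameter grid is shrunk for brevity:

```python
from sklearn import datasets
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

digits = datasets.load_digits()
X = digits.images.reshape((len(digits.images), -1))
y = digits.target
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Small grid for brevity; extend it with the values from the question.
param_grid = [{'kernel': ['rbf'], 'gamma': [1e-3, 1e-4], 'C': [1, 10]}]
clf = GridSearchCV(SVC(), param_grid=param_grid, cv=3)
clf.fit(X_train, y_train)

best = clf.best_estimator_    # the SVC refit with the best parameters
print(best.n_support_)        # support vectors per class (one entry per digit)
print(best.n_support_.sum())  # total number of support vectors
print(len(best.support_))     # same total, via the support-vector indices
```

Because GridSearchCV refits the best estimator on the full training set by default, this is equivalent to the question's workaround of training a fresh SVC with the best parameters, without having to copy them by hand.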