Scikit-learn's GridSearchCV with linear kernel SVM takes too long

Posted: 2012-09-18 22:48:37

Question:

I took the example code from the sklearn website, namely:

from sklearn.svm import SVC
from sklearn.grid_search import GridSearchCV
from sklearn.metrics import f1_score, classification_report

tuned_parameters = [{'kernel': ['rbf'], 'gamma': [1e-3, 1e-4], 'C': [1, 10, 100, 1000]},
                    {'kernel': ['linear'], 'C': [1, 10, 100, 1000]}]

scores = [('f1', f1_score)]

for score_name, score_func in scores:
    print "# Tuning hyper-parameters for %s" % score_name
    print

    clf = GridSearchCV( SVC(), tuned_parameters, score_func=score_func, n_jobs=-1, verbose=2 )
    clf.fit(X_train, Y_train)

    print "Best parameters set found on development set:"
    print
    print clf.best_estimator_
    print
    print "Grid scores on development set:"

    print
    for params, mean_score, scores in clf.grid_scores_:
        print "%0.3f (+/-%0.03f) for %r" % (
            mean_score, scores.std() / 2, params)
    print

    print "Detailed classification report:"
    print
    print "The model is trained on the full development set."
    print "The scores are computed on the full evaluation set."
    print
    y_true, y_pred = Y_test, clf.predict(X_test)
    print classification_report(y_true, y_pred)
    print

X_train is a pandas DataFrame with about 70 rows.

The output is:

[GridSearchCV] kernel=rbf, C=1, gamma=0.001 ....................................
[GridSearchCV] kernel=rbf, C=1, gamma=0.001 ....................................
[GridSearchCV] kernel=rbf, C=1, gamma=0.001 ....................................
[GridSearchCV] kernel=rbf, C=1, gamma=0.0001 ...................................
[Parallel(n_jobs=-1)]: Done   1 jobs       | elapsed:    0.0s
[GridSearchCV] ........................... kernel=rbf, C=1, gamma=0.001 -   0.0s
[GridSearchCV] ........................... kernel=rbf, C=1, gamma=0.001 -   0.0s
[GridSearchCV] ........................... kernel=rbf, C=1, gamma=0.001 -   0.0s
[GridSearchCV] .......................... kernel=rbf, C=1, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=rbf, C=1, gamma=0.0001 ...................................
[GridSearchCV] kernel=rbf, C=1, gamma=0.0001 ...................................
[GridSearchCV] kernel=rbf, C=10, gamma=0.001 ...................................
[GridSearchCV] kernel=rbf, C=10, gamma=0.001 ...................................
[GridSearchCV] .......................... kernel=rbf, C=1, gamma=0.0001 -   0.0s
[GridSearchCV] .......................... kernel=rbf, C=1, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=rbf, C=10, gamma=0.001 ...................................
[GridSearchCV] .......................... kernel=rbf, C=10, gamma=0.001 -   0.0s
[GridSearchCV] .......................... kernel=rbf, C=10, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=10, gamma=0.0001 ..................................
[GridSearchCV] .......................... kernel=rbf, C=10, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=10, gamma=0.0001 ..................................
[GridSearchCV] kernel=rbf, C=10, gamma=0.0001 ..................................
[GridSearchCV] ......................... kernel=rbf, C=10, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=rbf, C=100, gamma=0.001 ..................................
[GridSearchCV] ......................... kernel=rbf, C=10, gamma=0.0001 -   0.0s
[GridSearchCV] ......................... kernel=rbf, C=10, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=rbf, C=100, gamma=0.001 ..................................
[GridSearchCV] ......................... kernel=rbf, C=100, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=100, gamma=0.001 ..................................
[GridSearchCV] kernel=rbf, C=100, gamma=0.0001 .................................
[GridSearchCV] ......................... kernel=rbf, C=100, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=100, gamma=0.0001 .................................
[GridSearchCV] ......................... kernel=rbf, C=100, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=100, gamma=0.0001 .................................
[GridSearchCV] kernel=rbf, C=1000, gamma=0.001 .................................
[GridSearchCV] ........................ kernel=rbf, C=100, gamma=0.0001 -   0.0s
[GridSearchCV] ........................ kernel=rbf, C=100, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=rbf, C=1000, gamma=0.001 .................................
[GridSearchCV] ........................ kernel=rbf, C=100, gamma=0.0001 -   0.0s
[GridSearchCV] ........................ kernel=rbf, C=1000, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=1000, gamma=0.001 .................................
[GridSearchCV] kernel=rbf, C=1000, gamma=0.0001 ................................
[GridSearchCV] kernel=rbf, C=1000, gamma=0.0001 ................................
[GridSearchCV] ........................ kernel=rbf, C=1000, gamma=0.001 -   0.0s
[GridSearchCV] kernel=rbf, C=1000, gamma=0.0001 ................................
[GridSearchCV] ........................ kernel=rbf, C=1000, gamma=0.001 -   0.0s
[GridSearchCV] ....................... kernel=rbf, C=1000, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=linear, C=1 ..............................................
[GridSearchCV] ....................... kernel=rbf, C=1000, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=linear, C=1 ..............................................
[GridSearchCV] kernel=linear, C=1 ..............................................
[GridSearchCV] ....................... kernel=rbf, C=1000, gamma=0.0001 -   0.0s
[GridSearchCV] kernel=linear, C=10 .............................................

and then it never finishes. I am running it on a MacBook Pro with OS X Lion. What am I doing wrong?

Comments on the question:

Does it finish if you run it with n_jobs=1?

Answer 1:

I fixed it by normalizing the dataset before running the grid search, as described here: normalize-data-in-pandas.
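
For reference, here is a minimal sketch of that fix written against the current scikit-learn API. It assumes the same X_train/Y_train as in the question, and the StandardScaler-in-a-Pipeline approach stands in for the pandas-based normalization the answer links to:

# Sketch only: scale the features before the SVC so libsvm converges quickly.
# Assumes X_train, Y_train exist as in the question; the Pipeline keeps the
# scaling inside each cross-validation fold so the held-out fold is not leaked.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV  # modern module; was sklearn.grid_search

param_grid = [
    {'svc__kernel': ['rbf'], 'svc__gamma': [1e-3, 1e-4], 'svc__C': [1, 10, 100, 1000]},
    {'svc__kernel': ['linear'], 'svc__C': [1, 10, 100, 1000]},
]

pipe = Pipeline([('scale', StandardScaler()), ('svc', SVC())])
clf = GridSearchCV(pipe, param_grid, scoring='f1', n_jobs=-1, verbose=2)
clf.fit(X_train, Y_train)
print(clf.best_params_)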

Comments on the answer:

Indeed, SVC seems to be very sensitive to non-normalized data. Is your data private, or can it be made public? If you can share it, please report the issue at github.com/scikit-learn/scikit-learn/issues (just the SVC call, with the parameters and data that trigger the freeze). There has been some discussion on the mailing list about adding a max_iter parameter to libsvm to avoid this problem.

@fspirit you are a genius
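
For what it's worth, later scikit-learn releases did add a max_iter parameter to SVC (default -1, meaning no limit). A minimal sketch of using it as a safety net, assuming the X_train/Y_train from the question and a purely illustrative iteration cap:

# Sketch only: cap libsvm's optimizer so an ill-scaled problem stops with a
# ConvergenceWarning instead of appearing to hang forever. The limit of
# 10000 iterations is illustrative, not a recommendation.
from sklearn.svm import SVC

clf = SVC(kernel='linear', C=1, max_iter=10000)
clf.fit(X_train, Y_train)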
