Kernel in a logistic regression model (LogisticRegression, scikit-learn, sklearn)
【Posted】: 2019-04-11 10:53:47
【Question】: How can I use a kernel in a logistic regression model with the sklearn library?
from sklearn import metrics
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix

logreg = LogisticRegression()
logreg.fit(X_train, y_train)

y_pred = logreg.predict(X_test)
print(y_pred)
print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))

predicted = logreg.predict(predict)
print("Accuracy:", metrics.accuracy_score(y_test, y_pred))
【Comments】:
Hope my answer helps.

【Answer 1】:
Very good question, but scikit-learn currently supports neither kernel logistic regression nor the ANOVA kernel. You can implement them yourself, though.
Example 1: ANOVA kernel
import numpy as np
from scipy.linalg import cholesky
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import check_pairwise_arrays

def anova_kernel(X, Y=None, gamma=None, p=1):
    X, Y = check_pairwise_arrays(X, Y)
    if gamma is None:
        gamma = 1. / X.shape[1]
    diff = X[:, None, :] - Y[None, :, :]
    diff **= 2
    diff *= -gamma
    np.exp(diff, out=diff)
    K = diff.sum(axis=2)
    K **= p
    return K

# Kernel matrix based on X, the matrix of all data points
K = anova_kernel(X)
# Lower-triangular Cholesky factor: K = R @ R.T, so row i of R
# acts as the feature vector for sample i
R = cholesky(K, lower=True)

# Define the model
clf = LogisticRegression()

# Here I assume that you have already split the data; `train` and
# `test` are the index arrays for the training and test sets
clf.fit(R[train], y_train)
preds = clf.predict(R[test])
Example 2: Nyström
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

K_train = anova_kernel(X_train)

clf = Pipeline([
    ('nys', Nystroem(kernel='precomputed', n_components=100)),
    ('lr', LogisticRegression())])
clf.fit(K_train, y_train)

K_test = anova_kernel(X_test, X_train)
preds = clf.predict(K_test)
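For a simpler variant of the same idea, Nystroem can also approximate one of scikit-learn's built-in kernels (e.g. 'rbf') directly from the raw inputs, so no precomputed kernel matrix is needed. The sketch below is self-contained on synthetic data; the dataset, gamma, and n_components values are placeholder assumptions, not from the question:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Synthetic data stands in for the question's X_train / X_test
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Nystroem learns an approximate feature map for the RBF kernel,
# so the downstream LogisticRegression is effectively kernelized
clf = Pipeline([
    ('nys', Nystroem(kernel='rbf', gamma=0.1, n_components=50, random_state=0)),
    ('lr', LogisticRegression())])
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

Swapping kernel='rbf' for kernel='precomputed' recovers the ANOVA setup above; the pipeline itself is unchanged.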