Regression with additional variables to optimize
Posted: 2019-08-31 16:17:17

Question:

I am given data consisting of X and Y points (x_1, ..., x_n; y_1, ..., y_n).
I would like to fit X to Y using two basis functions: max(x, mu_1) and min(x, mu_2).

In other words, I want to estimate the following equation:

y_i = a_1 * max(x_i, mu_1) + a_2 * min(x_i, mu_2)

I want to find the mu_1 and mu_2 that give the best fit, meaning the mu_1 and mu_2 such that the sum of squared residuals is minimized when Y is fitted to X. Equivalently, I need the a_1, a_2, mu_1, mu_2 that minimize the sum of squared residuals of the fit above.
I tried the following: I created a function of two parameters (mu_1 and mu_2) that returns the quality of the fit of Y to X, and then tried to optimize this function using scipy.optimize.minimize. Here is the code:
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression

### Create X and Y
X = np.random.normal(10, 1, size=10000)
Y = np.random.normal(20, 1, size=10000)

### Create a function that estimates the quality of the fit
def func(mu_1, mu_2):
    ### basis functions
    regressor_1 = np.maximum(X, mu_1).reshape(-1, 1)
    regressor_2 = np.minimum(X, mu_2).reshape(-1, 1)
    x_train = np.hstack((regressor_1, regressor_2))
    model = LinearRegression().fit(x_train, Y)
    ### I didn't find how to extract the sum of squared residuals, but I can
    ### get R squared, so I figured that minimizing the SSR is the same as
    ### maximizing R squared, which is the same as minimizing -R^2
    objective = model.score(x_train, Y)
    return -1 * objective

### Now I want to find such mu_1 and mu_2 that minimize "func"
minimum = minimize(func, 0, 0)
minimum.x
It doesn't work. I would be very grateful for any help.
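For reference, scipy.optimize.minimize expects the objective to take a single parameter vector as its first argument and the initial guess as its second, so minimize(func, 0, 0) passes 0 as x0 and 0 as extra args. A minimal sketch of the call in that form (the starting values [10.0, 10.0] are an arbitrary assumption, and the SSR is computed directly from the predictions rather than via R squared):

import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression

X = np.random.normal(10, 1, size=10000)
Y = np.random.normal(20, 1, size=10000)

def func(params):
    # minimize() passes all free parameters as one array
    mu_1, mu_2 = params
    regressor_1 = np.maximum(X, mu_1).reshape(-1, 1)
    regressor_2 = np.minimum(X, mu_2).reshape(-1, 1)
    x_train = np.hstack((regressor_1, regressor_2))
    model = LinearRegression().fit(x_train, Y)
    # sum of squared residuals, computed directly from the fitted model
    residuals = Y - model.predict(x_train)
    return np.sum(residuals ** 2)

# arbitrary starting guesses for mu_1 and mu_2
result = minimize(func, x0=np.array([10.0, 10.0]))
print(result.x)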
Answer 1:

This graphical fitter uses your function, and it appears to do what you want.
import numpy, scipy, matplotlib
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
import warnings

xData = numpy.array([1.1, 2.2, 3.3, 4.4, 5.0, 6.6, 7.7])
yData = numpy.array([1.1, 20.2, 30.3, 60.4, 50.0, 60.6, 70.7])

def func(x, a_1, a_2, mu_1, mu_2):
    retArray = []
    for x_i in x: # process data points individually
        val = a_1 * max(x_i, mu_1) + a_2 * min(x_i, mu_2)
        retArray.append(val)
    return retArray

# turn off the curve_fit() "covariance estimation" warning
warnings.filterwarnings("ignore")

# these are the same as the scipy defaults
initialParameters = numpy.array([1.0, 1.0, 1.0, 1.0])

# curve fit the test data
fittedParameters, pcov = curve_fit(func, xData, yData, initialParameters)

modelPredictions = func(xData, *fittedParameters)
absError = modelPredictions - yData

SE = numpy.square(absError) # squared errors
MSE = numpy.mean(SE) # mean squared errors
RMSE = numpy.sqrt(MSE) # Root Mean Squared Error, RMSE
Rsquared = 1.0 - (numpy.var(absError) / numpy.var(yData))

print('Parameters:', fittedParameters)
print('RMSE:', RMSE)
print('R-squared:', Rsquared)
print()

##########################################################
# graphics output section
def ModelAndScatterPlot(graphWidth, graphHeight):
    f = plt.figure(figsize=(graphWidth/100.0, graphHeight/100.0), dpi=100)
    axes = f.add_subplot(111)

    # first the raw data as a scatter plot
    axes.plot(xData, yData, 'D')

    # create data for the fitted equation plot
    xModel = numpy.linspace(min(xData), max(xData))
    yModel = func(xModel, *fittedParameters)

    # now the model as a line plot
    axes.plot(xModel, yModel)

    axes.set_xlabel('X Data') # X axis data label
    axes.set_ylabel('Y Data') # Y axis data label

    plt.show()
    plt.close('all') # clean up after using pyplot

graphWidth = 800
graphHeight = 600
ModelAndScatterPlot(graphWidth, graphHeight)
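As a side note, func() above evaluates the data points one at a time in a Python loop; a vectorized sketch using numpy.maximum and numpy.minimum, which should behave the same on array input, would be:

import numpy

def func_vectorized(x, a_1, a_2, mu_1, mu_2):
    # elementwise max/min of the data against the scalar breakpoints
    x = numpy.asarray(x, dtype=float)
    return a_1 * numpy.maximum(x, mu_1) + a_2 * numpy.minimum(x, mu_2)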