How to fix a ValueError reported when using optimize.minimize(), even though the referenced function itself has no error?

To find the optimal value of theta for logistic regression I used the optimize.minimize() function. My function costFunction(X, y, theta) returns the cost and the gradient for given values of X, y and theta. I have already checked costFunction() with an initial value of theta and it works fine. But when this function is passed to optimize.minimize(), a ValueError is raised.

Here is my costFunction code and the place where I call optimize.minimize():

def costFunction(X,y,theta):
    J = 0.0
    m = Y.size
    J = -1/m * np.sum(((1-y)*np.log(1-sigmoid(np.dot(X,theta))))+((y)*np.log(sigmoid(np.dot(X,theta)))))
    grad = 1/m*np.dot(X.T,(sigmoid(np.dot(X,theta))-y))
    return J, grad

#To check the function :
print(X[:,:3].shape)
J,grad = costFunction(X[:,:3],Y,theta=[0,0,0])
print(J)
print( grad)

#and this returns the following output:
(1000, 3)
0.6931471805599454
[ 0.      17.25682  5.92721]

#and here's where I call optimize.minimize() function:
options = {'maxiter' : 400}
initial_theta = np.zeros(3)
x = X[:,:3]
#res = optimize.minimize(computeCost,initial_theta,(X[:,:3],Y),jac = True,method = 'TNC',options = options)
res = optimize.minimize(costFunction,
                        initial_theta,
                        (x, Y),
                        jac=True,
                        method='TNC',
                        options=options)

cost = res.fun
theta = res.x
print("cost ".cost)
print("theta ".theta)
#and it returns the following error :

ValueError                                Traceback (most recent call last)
<ipython-input-69-55576d96c00a> in <module>
      8                         jac=True,
      9                         method='TNC',
---> 10                         options=options)
     11 
     12 cost = res.fun

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
    604     elif meth == 'tnc':
    605         return _minimize_tnc(fun, x0, args, jac, bounds, callback=callback,
--> 606                              **options)
    607     elif meth == 'cobyla':
    608         return _minimize_cobyla(fun, x0, args, constraints, **options)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/tnc.py in _minimize_tnc(fun, x0, args, jac, bounds, eps, scale, offset, mesg_num, maxCGit, maxiter, eta, stepmx, accuracy, minfev, ftol, xtol, gtol, rescale, disp, callback, **unknown_options)
    407                                         offset, messages, maxCGit, maxfun,
    408                                         eta, stepmx, accuracy, fmin, ftol,
--> 409                                         xtol, pgtol, rescale, callback)
    410 
    411     funv, jacv = func_and_grad(x)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/tnc.py in func_and_grad(x)
    369     else:
    370         def func_and_grad(x):
--> 371             f = fun(x, *args)
    372             g = jac(x, *args)
    373             return f, g

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in __call__(self, x, *args)
     61     def __call__(self, x, *args):
     62         self.x = numpy.asarray(x).copy()
---> 63         fg = self.fun(x, *args)
     64         self.jac = fg[1]
     65         return fg[0]

<ipython-input-65-97115ec06e6e> in costFunction(X, y, theta)
      2     J = 0.0
      3     m = Y.size
----> 4     J = -1/m * np.sum(((1-y)*np.log(1-sigmoid(np.dot(X,theta))))+((y)*np.log(sigmoid(np.dot(X,theta)))))
      5     grad = 1/m*np.dot(X.T,(sigmoid(np.dot(X,theta))-y))
      6     return J, grad

ValueError: shapes (3,) and (1000,) not aligned: 3 (dim 0) != 1000 (dim 0)
Answer

It seems the error is caused by the order of the arguments in the optimize.minimize() call. Adding a few shape prints to costFunction makes this visible:

def costFunction(X,y,theta):
    J = 0.0
    m = y.size
    print(y.shape)
    print(X.shape)
    print(theta.shape)
    J = -1/m * np.sum(((1-y)*np.log(1-sigmoid(np.dot(X,theta))))+((y)*np.log(sigmoid(np.dot(X,theta)))))
    grad = 1/m*np.dot(X.T, (sigmoid(np.dot(X, theta))-y))
    return J, grad
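
When this version is called through optimize.minimize(), the prints show different shapes than in the explicit test above. Here is a rough, self-contained sketch of why (the shape values assume the 1000-by-3 X from the question; the variable names are only for illustration):

import numpy as np

# minimize()/TNC always calls the objective as fun(current_theta, *args)
# (see "f = fun(x, *args)" in the traceback), so the question's call
#     optimize.minimize(costFunction, initial_theta, (x, Y), jac=True, ...)
# amounts to costFunction(initial_theta, x, Y).
initial_theta = np.zeros(3)        # parameter vector that minimize() passes first
x = np.random.random((1000, 3))    # first entry of args (the data matrix)
Y = np.random.random(1000)         # second entry of args (the labels)

# With the signature costFunction(X, y, theta), those arguments are bound as:
X, y, theta = initial_theta, x, Y
print(X.shape, y.shape, theta.shape)   # (3,) (1000, 3) (1000,)

# np.dot(X, theta) inside costFunction then combines a (3,) array with a
# (1000,) array, which is exactly the "shapes (3,) and (1000,) not aligned"
# error in the traceback.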

In short, scipy.optimize.minimize() always passes the parameter vector being optimized as the first argument of the objective function, followed by whatever is given in args. Since you want to optimize theta, costFunction() must take theta as its first parameter, so I suggest changing the order of the parameters in costFunction() and adjusting the optimize.minimize() call accordingly. Here is a working example:

from scipy import optimize
import numpy as np

def sigmoid(t):
    # standard logistic function (note the minus sign in the exponent)
    return 1. / (1. + np.exp(-t))

X = np.random.random(size=(1000,3))
Y = np.random.random(size=(1000))

def costFunction(theta, x,y):
    J = 0.0
    m = y.size
    J = -1/m * np.sum(((1-y)*np.log(1-sigmoid(np.dot(x,theta))))+((y)*np.log(sigmoid(np.dot(x,theta)))))
    grad = 1/m*np.dot(x.T, (sigmoid(np.dot(x, theta))-y))
    return J, grad

#To check the function :
print(X[:,:3].shape)
J,grad = costFunction(theta=np.asarray([0,0,0]), x=X[:,:3],y=Y)
print(J)
print( grad)

options = {'maxiter' : 400}
initial_theta = np.zeros(3)
x = X[:,:3]
res = optimize.minimize(costFunction,
                        x0 = initial_theta,
                        args=(x, Y),
                        jac=True,
                        method='TNC',
                        options=options)

cost = res.fun
thetaresult = res.x
print(cost)
print(thetaresult)
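
As a side note (not part of the original answer): if you would rather keep the question's original costFunction(X, y, theta) signature, a small wrapper that reorders the arguments for minimize() works as well, for example:

# Alternative sketch: keep the original costFunction(X, y, theta) unchanged and
# let a lambda put the arguments back into the order minimize() uses
# (the theta vector first, then the contents of args).
res = optimize.minimize(lambda theta, X, y: costFunction(X, y, theta),
                        x0=initial_theta,
                        args=(x, Y),
                        jac=True,
                        method='TNC',
                        options=options)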
