Two-Variable Linear Regression
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the 3D projection (needed on older matplotlib versions)

# Load the data: columns 1 and 2 are used as the two features and the last
# column as the target (the indices follow the original Delivery.csv layout).
data = np.genfromtxt("Delivery.csv", delimiter=',')
x_data = data[:, [1, 2]]
y_data = data[:, [-1]]   # kept 2-D so y_data[j][0] works below

# Hyperparameters and initial model parameters
lr = 0.000001   # learning rate
theta0 = 0      # intercept
theta1 = 0      # coefficient of the first feature
theta2 = 0      # coefficient of the second feature
epochs = 50     # number of passes over the data

print(x_data[0])  # quick look at the first feature row

def gradient_descent_runner(x_data, y_data, lr, theta0, theta1, theta2, epochs):
    m = len(x_data)
    for _ in range(epochs):
        tmp_theta0 = 0
        tmp_theta1 = 0
        tmp_theta2 = 0
        # Accumulate the gradient of the squared-error cost over all samples
        for j in range(m):
            error = theta0 + theta1 * x_data[j][0] + theta2 * x_data[j][1] - y_data[j][0]
            tmp_theta0 += error
            tmp_theta1 += error * x_data[j][0]
            tmp_theta2 += error * x_data[j][1]
        # Batch update: step against the averaged gradient
        theta0 -= lr * tmp_theta0 / m
        theta1 -= lr * tmp_theta1 / m
        theta2 -= lr * tmp_theta2 / m
    return theta0, theta1, theta2

theta0, theta1, theta2 = gradient_descent_runner(x_data, y_data, lr, theta0, theta1, theta2, epochs)

# Plot the fitted plane z = theta0 + theta1*x0 + theta2*x1
ax = plt.figure().add_subplot(111, projection='3d')
x0 = x_data[:, 0]
x1 = x_data[:, 1]
x0, x1 = np.meshgrid(x0, x1)
z = theta0 + theta1 * x0 + theta2 * x1
ax.plot_surface(x0, x1, z)
plt.show()
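For reference, the inner loop above is batch gradient descent on the mean squared error cost. Writing x_{j,1} and x_{j,2} for x_data[j][0] and x_data[j][1], and alpha for the learning rate lr, each epoch performs:

J(\theta_0,\theta_1,\theta_2) = \frac{1}{2m}\sum_{j=1}^{m}\bigl(\theta_0 + \theta_1 x_{j,1} + \theta_2 x_{j,2} - y_j\bigr)^2

\theta_0 \leftarrow \theta_0 - \frac{\alpha}{m}\sum_{j=1}^{m} e_j, \qquad
\theta_1 \leftarrow \theta_1 - \frac{\alpha}{m}\sum_{j=1}^{m} e_j\, x_{j,1}, \qquad
\theta_2 \leftarrow \theta_2 - \frac{\alpha}{m}\sum_{j=1}^{m} e_j\, x_{j,2}

where e_j = \theta_0 + \theta_1 x_{j,1} + \theta_2 x_{j,2} - y_j is exactly the error accumulated in tmp_theta0, tmp_theta1, tmp_theta2.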
The above implements linear regression with gradient descent: the loop fits a plane to the two features, and the final plot shows the fitted surface.
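As a quick sanity check on the fitted parameters, the same data can be fit with scikit-learn's closed-form least squares solver. This is a minimal sketch, not part of the original post; it assumes scikit-learn is installed and that Delivery.csv has the column layout used above.

# Sanity check with scikit-learn (assumes the same Delivery.csv layout as above)
import numpy as np
from sklearn.linear_model import LinearRegression

data = np.genfromtxt("Delivery.csv", delimiter=',')
x_data = data[:, [1, 2]]   # same feature columns as in the gradient descent code
y_data = data[:, -1]       # target as a 1-D array

model = LinearRegression()
model.fit(x_data, y_data)
print("intercept (theta0):", model.intercept_)
print("coefficients (theta1, theta2):", model.coef_)

With only 50 epochs and a very small learning rate, the gradient descent estimates will typically still be some distance from this closed-form solution; increasing epochs (or carefully increasing lr) brings them closer.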