Coursera Machine Learning: Week 2 - Lecture 1 - Gradient Descent for Multiple Variables
Gradient Descent For Multiple Variables
Problem statement: in Week 2, the gradient descent problem extends from a single variable to multiple variables.
The corresponding formulas are as follows.
Gradient descent algorithm:
\[
\begin{array}{l}
\text{repeat until convergence: } \{ \\
\quad \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\left(x^{(i)}\right) - y^{(i)} \right) \cdot x_j^{(i)} \quad \text{for } j := 0 \ldots n \\
\}
\end{array}
\]
That is:
\[
\begin{array}{l}
\text{repeat until convergence: } \{ \\
\quad \theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\left(x^{(i)}\right) - y^{(i)} \right) \cdot x_0^{(i)} \\
\quad \theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\left(x^{(i)}\right) - y^{(i)} \right) \cdot x_1^{(i)} \\
\quad \theta_2 := \theta_2 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\left(x^{(i)}\right) - y^{(i)} \right) \cdot x_2^{(i)} \\
\quad \cdots \\
\}
\end{array}
\]
The parameters \(\theta_0\), \(\theta_1\), \(\theta_2\), ... must all be updated simultaneously, i.e., every \(\theta_j\) is computed from the same old parameter values before any of them is overwritten.
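The update rule above can be sketched in Python with NumPy. This is a minimal illustration, not code from the course: the function name, the synthetic data, and the choice of learning rate are all my own. Stacking the examples into a matrix `X` (with a leading column of ones for \(x_0 = 1\)) lets one matrix product compute every \(\theta_j\) update at once, which also guarantees the simultaneous update the text requires.

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for multivariate linear regression.

    X is assumed to already include a leading column of ones
    (the x_0 = 1 feature), so theta[0] is the intercept term.
    """
    m = len(y)
    for _ in range(num_iters):
        # h_theta(x^(i)) for all i at once
        predictions = X @ theta
        errors = predictions - y
        # Simultaneous update: every theta_j is derived from the same
        # errors vector before any component of theta is overwritten.
        theta = theta - (alpha / m) * (X.T @ errors)
    return theta

# Illustrative example: recover theta = [1, 2, 3] from noiseless data
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.random((100, 2))])
y = X @ np.array([1.0, 2.0, 3.0])
theta = gradient_descent(X, y, np.zeros(3), alpha=0.5, num_iters=2000)
```

Updating the components one at a time in a Python loop, reusing the partially updated `theta` when computing the next `errors`, would break the simultaneity and no longer match the formulas above.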