[ML] Gradient Descent Algorithm [Octave code]
Posted by KennyRom
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
  m = length(y);                    % number of training examples
  J_history = zeros(num_iters, 1);  % cost recorded after each iteration

  for iter = 1:num_iters
    % Vectorized batch gradient descent update for multivariate linear regression:
    % theta := theta - (alpha / m) * X' * (X * theta - y)
    theta = theta - alpha * X' * (X * theta - y) / m;

    % Record the cost after this update
    J_history(iter) = computeCostMulti(X, y, theta);
  end
end
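The loop relies on a helper computeCostMulti that the post does not show. Below is a minimal sketch of it, assuming the usual mean-squared-error cost for linear regression, J(theta) = (1/(2m)) * sum((X*theta - y).^2); the name and signature match the call above, but the body is an assumption rather than the post's own code.

function J = computeCostMulti(X, y, theta)
  % Assumed implementation: mean-squared-error cost for linear regression,
  % J(theta) = (1 / (2*m)) * sum((X*theta - y).^2)
  m = length(y);            % number of training examples
  errors = X * theta - y;   % residuals of the current hypothesis
  J = (errors' * errors) / (2 * m);
end

A small end-to-end call might look like the following; the data here is synthetic and purely illustrative.

% Synthetic example (hypothetical values, not from the post)
m = 50;                                  % number of training examples
X = [ones(m, 1), randn(m, 2)];           % bias column plus two random features
y = X * [1; 2; 3] + 0.1 * randn(m, 1);   % targets from a known linear model plus noise
theta = zeros(size(X, 2), 1);            % start from all-zero parameters
[theta, J_history] = gradientDescentMulti(X, y, theta, 0.1, 400);
plot(1:400, J_history);                  % cost should decrease for a suitable alpha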
The above is the main content on [ML] Gradient Descent Algorithm [Octave code]. If it did not solve your problem, see the following related articles:
Stochastic Gradient Descent Convergence
Machine Learning Introduction Series 04: Gradient Descent
Slow training runtime performance with Apache Spark Gradient Boosted Tree