Gradient Boosting

References:

1. GBDT (MART): An Introductory Tutorial on Iterative Decision Trees

2. Wikipedia: Gradient boosting

Generic gradient boosting:

Input: a training set $\{(x_{i}, y_{i})\}_{i=1}^{n}$, a differentiable loss function $L(y, F(x))$, and the number of iterations $M$.

Algorithm:

1. Initialize the model with a constant value:

\[F_{0}(x)=\underset{\gamma}{\arg\min} \sum_{i=1}^{n} L(y_{i}, \gamma)\]

2. For $m = 1$ to $M$:

  1) Compute the "pseudo-residuals":

\[
r_{im}=-\left[\frac{\partial L(y_{i}, F(x_{i}))}{\partial F(x_{i})}\right]_{F(x)=F_{m-1}(x)}, \quad i=1, \ldots, n
\]

  2) Fit a base learner $h_{m}(x)$ (for GBDT/MART, a regression tree) to the pseudo-residuals, i.e. train it on the pairs $\{(x_{i}, r_{im})\}_{i=1}^{n}$.

  3) Compute the step multiplier $\gamma_{m}$ by a one-dimensional line search:

\[
\gamma_{m}=\underset{\gamma}{\arg\min} \sum_{i=1}^{n} L(y_{i}, F_{m-1}(x_{i})+\gamma h_{m}(x_{i}))
\]

  4) Update the model:

\[
F_{m}(x)=F_{m-1}(x)+\gamma_{m} h_{m}(x)
\]

3. Output $F_{M}(x)$.
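
To make the procedure concrete, below is a minimal Python sketch for the squared-error loss $L(y, F)=\frac{1}{2}(y-F)^{2}$, where the pseudo-residuals reduce to the ordinary residuals $y_{i}-F_{m-1}(x_{i})$ and the optimal constant $F_{0}$ is the mean of the $y_{i}$. It uses scikit-learn regression trees as the base learners; the class name and the default hyperparameter values are illustrative choices, not part of the algorithm above.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

class SimpleGradientBoosting:
    """Minimal gradient boosting for squared-error loss (illustrative sketch)."""

    def __init__(self, n_estimators=100, learning_rate=0.1, max_depth=3):
        self.n_estimators = n_estimators    # M, the number of iterations
        self.learning_rate = learning_rate  # shrinkage applied to each stage
        self.max_depth = max_depth          # depth of each regression-tree base learner

    def fit(self, X, y):
        y = np.asarray(y, dtype=float)
        # Step 1: the constant that minimizes squared error is the mean of y.
        self.f0 = y.mean()
        self.trees = []
        F = np.full(len(y), self.f0)
        for m in range(self.n_estimators):
            # Step 2.1: pseudo-residuals; for squared error, r_im = y_i - F_{m-1}(x_i).
            r = y - F
            # Step 2.2: fit a base learner h_m(x) to the pseudo-residuals.
            tree = DecisionTreeRegressor(max_depth=self.max_depth)
            tree.fit(X, r)
            # Steps 2.3-2.4: update the model. For squared error the optimal gamma_m
            # is already absorbed into the tree's leaf values, so only shrinkage is applied.
            F = F + self.learning_rate * tree.predict(X)
            self.trees.append(tree)
        return self

    def predict(self, X):
        F = np.full(len(X), self.f0)
        for tree in self.trees:
            F += self.learning_rate * tree.predict(X)
        return F
```

The `learning_rate` shrinkage factor is the usual regularization device in gradient boosting libraries; setting it to 1.0 recovers the plain update $F_{m}(x)=F_{m-1}(x)+h_{m}(x)$, since for squared-error loss the line-search multiplier $\gamma_{m}$ equals 1 when the tree's leaves predict mean residuals.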

The rest of this post is a collection of LaTeX/MathJax syntax examples.

\[ \sum_{k=1}^n k = \frac{1}{2} n (n+1).\]

\[ \frac{\partial u}{\partial t}
= h^2 \left( \frac{\partial^2 u}{\partial x^2}
+ \frac{\partial^2 u}{\partial y^2}
+ \frac{\partial^2 u}{\partial z^2}\right)\]

Newton's second law is F=ma.

Newton's second law is $F=ma$.

Newton's second law is
$$F=ma$$

Newton's second law is
\[F=ma\]

Greek Letters $\eta$ and $\mu$

Fraction $\frac{a}{b}$

Power $a^b$

Subscript $a_b$

Derivative $\frac{\partial y}{\partial t}$

Vector $\vec{n}$

Bold $\mathbf{n}$

Time derivative $\dot{F}$

Matrix (lcr here specifies left, center, or right alignment for each column)
\[
\left[
\begin{array}{lcr}
a1 & b22 & c333 \\
d444 & e555555 & f6
\end{array}
\right]
\]

Equations (here & marks the alignment point in each row)
\begin{align}
a+b&=c\\
d&=e+f+g
\end{align}

\[
\left\{
\begin{aligned}
&a+b=c\\
&d=e+f+g
\end{aligned}
\right.
\]
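
As a combined usage example, the key gradient-boosting equations from the first part of this post can be typeset with the align syntax shown above (the & characters line the three equations up at their = signs):

\begin{align}
F_{0}(x) &= \underset{\gamma}{\arg\min} \sum_{i=1}^{n} L(y_{i}, \gamma)\\
r_{im} &= -\left[\frac{\partial L(y_{i}, F(x_{i}))}{\partial F(x_{i})}\right]_{F(x)=F_{m-1}(x)}\\
F_{m}(x) &= F_{m-1}(x)+\gamma_{m} h_{m}(x)
\end{align}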
