Derivation of Support Vector Regression
Posted by xiemaycherry
Support vector machines for classification (hard-margin and soft-margin) try to separate the classes as well as possible.
Support vector regression instead wants \(f(x)\) to be as close as possible to \(y\).
Basic idea of support vector regression
Full English name: support vector regression
Abbreviation: SVR
The standard linear support vector regression model
The model to be learned:
\[
f(x) = w^T x + b
\]
Suppose we can tolerate an absolute deviation of up to \(\varepsilon\) between \(f(x)\) and \(y\). This forms a tube of width \(2\varepsilon\) around \(f(x)=w^Tx+b\), giving the model
\[
\min_{w,b}\ \frac{1}{2}w^Tw \quad \text{s.t.}\ -\varepsilon \le f(x_i) - y_i \le \varepsilon
\]
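As a quick numerical illustration (my own sketch, not part of the original derivation; names are illustrative), the hard constraint just says that every residual lies inside the \(\varepsilon\)-tube:

```python
import numpy as np

# Sketch: check whether a given (w, b) satisfies the hard epsilon-tube
# constraint -eps <= f(x_i) - y_i <= eps for all training points.
def in_tube(w, b, X, y, eps):
    residuals = X @ w + b - y          # f(x_i) - y_i for every sample
    return bool(np.all(np.abs(residuals) <= eps))

X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.1, 1.0, 2.1])          # roughly y = x
print(in_tube(np.array([1.0]), 0.0, X, y, eps=0.2))   # True: all residuals within 0.2
print(in_tube(np.array([1.0]), 0.0, X, y, eps=0.05))  # False: residual 0.1 exceeds 0.05
```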
These constraints are too strict in practice, so we introduce slack variables \(\xi_i, \hat{\xi}_i\) with a penalty term:
\[
\min_{w,b,\xi,\hat{\xi}}\ \frac{1}{2}w^Tw + C\sum_{i}\left(\xi_i+\hat{\xi}_i\right)
\quad \text{s.t.}\
\begin{cases}
y_i - f(x_i) \le \varepsilon + \xi_i \\
f(x_i) - y_i \le \varepsilon + \hat{\xi}_i \\
\xi_i \ge 0,\ \hat{\xi}_i \ge 0
\end{cases}
\]
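At the optimum the slacks satisfy \(\xi_i+\hat{\xi}_i = \max(0, |f(x_i)-y_i|-\varepsilon)\), so this objective is L2 regularization plus the \(\varepsilon\)-insensitive loss. The following sketch (my own illustration, not the original author's code; the subgradient solver, its step schedule, and the per-sample averaging of the penalty are all assumptions) minimizes it directly on a toy dataset:

```python
import numpy as np

# Sketch: minimize (1/2)||w||^2 + C * mean(max(0, |f(x_i)-y_i| - eps))
# by subgradient descent with a 1/sqrt(t) step size. The penalty is
# averaged over samples (instead of summed) to keep step sizes stable.
def fit_linear_svr(X, y, eps=0.05, C=10.0, lr=0.5, n_iter=30000):
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for t in range(n_iter):
        r = X @ w + b - y                               # residuals f(x_i) - y_i
        s = np.where(np.abs(r) > eps, np.sign(r), 0.0)  # subgradient of the loss
        step = lr / np.sqrt(t + 1)
        w -= step * (w + C * (X.T @ s) / n)
        b -= step * (C * s.mean())
    return w, b

X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0                               # noiseless line y = 2x + 1
w, b = fit_linear_svr(X, y)
```

The regularizer pulls \(w\) slightly below the true slope 2: shrinking \(w\) is free until points start leaving the \(\varepsilon\)-tube.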
Construct the Lagrange function:
\[
\begin{aligned}
L := \frac{1}{2}\|w\|^{2} &+ C\sum_{i}\left(\xi_i+\hat{\xi}_i\right) - \sum_{i=1}^{N}\left(\eta_i\xi_i + \hat{\eta}_i\hat{\xi}_i\right) \\
&+ \sum_{i}\alpha_i\left(y_i - w^{T}x_i - b - \varepsilon - \xi_i\right) \\
&+ \sum_{i}\hat{\alpha}_i\left(w^{T}x_i + b - y_i - \varepsilon - \hat{\xi}_i\right)
\end{aligned} \tag{1}
\]
Taking partial derivatives:
\[
\frac{\partial L}{\partial w} = w - \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right)x_i = 0 \;\Rightarrow\; w = \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right)x_i \tag{2}
\]
\[ \frac{\partial L}{\partial b} = \sum_{i=1}^{N}\left(\alpha_i - \hat{\alpha}_i\right) = 0 \tag{3} \]
\[ \frac{\partial L}{\partial \hat{\xi}_i} = C - \hat{\alpha}_i - \hat{\eta}_i = 0 \tag{4} \]
\[ \frac{\partial L}{\partial \xi_i} = C - \alpha_i - \eta_i = 0 \tag{5} \]
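These conditions can be checked mechanically. In particular, (4) and (5) with \(\eta_i, \hat{\eta}_i \ge 0\) force \(\alpha_i, \hat{\alpha}_i \in [0, C]\), and (2) says \(w\) is a linear combination of the training points. A tiny sketch with hypothetical multiplier values (my own, chosen to satisfy the constraints):

```python
import numpy as np

# Hypothetical multipliers chosen so that (3) holds: sum(alpha - alpha_hat) = 0.
C = 1.0
alpha = np.array([0.0, 0.7, 0.3])
alpha_hat = np.array([1.0, 0.0, 0.0])

# (4) and (5) define eta_hat_i = C - alpha_hat_i and eta_i = C - alpha_i;
# requiring eta, eta_hat >= 0 is exactly the box constraint alpha <= C.
eta = C - alpha
eta_hat = C - alpha_hat
print(np.all(eta >= 0) and np.all(eta_hat >= 0))   # True

# (2): w is a linear combination of the training points.
X = np.array([[0.0], [1.0], [2.0]])
w = (alpha - alpha_hat) @ X
print(w)                                            # [1.3]
```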
Substituting (2)-(5) back into (1), and noting that (4) and (5) together with \(\eta_i, \hat{\eta}_i \ge 0\) imply \(0 \le \alpha_i, \hat{\alpha}_i \le C\), we obtain the dual problem
\[
\begin{aligned}
\min_{\alpha,\hat{\alpha}}\ L(\alpha,\hat{\alpha}) ={}& \frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}\left(\alpha_i - \hat{\alpha}_i\right)\left(\alpha_j - \hat{\alpha}_j\right)\left\langle x_i, x_j\right\rangle \\
&+ \varepsilon\sum_{i=1}^{N}\left(\alpha_i + \hat{\alpha}_i\right) - \sum_{i=1}^{N} y_i\left(\alpha_i - \hat{\alpha}_i\right) \\
\text{s.t.}\ & \sum_{i=1}^{N}\left(\alpha_i - \hat{\alpha}_i\right) = 0,\quad 0 \le \alpha_i, \hat{\alpha}_i \le C
\end{aligned}
\]
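A small numerical sanity check of this dual (my own sketch; the data and the trial point \((w, b)\) are made up): any dual-feasible \((\alpha, \hat{\alpha})\), evaluated in the maximization form of the objective, lower-bounds the primal objective at any \((w, b)\), i.e. weak duality.

```python
import numpy as np

rng = np.random.default_rng(0)
n, C, eps = 8, 1.0, 0.1
X = rng.normal(size=(n, 2))
y = X @ np.array([1.0, -2.0]) + 0.3 * rng.normal(size=n)

# Build a dual-feasible point: delta_i = alpha_i - alpha_hat_i with
# sum(delta) == 0 and alpha, alpha_hat in [0, C].
r = rng.uniform(-C / 2, C / 2, size=n)
delta = r - r.mean()                      # sum is zero, |delta_i| <= C
alpha, alpha_hat = np.maximum(delta, 0), np.maximum(-delta, 0)

K = X @ X.T                               # Gram matrix <x_i, x_j>
dual = (-0.5 * delta @ K @ delta
        - eps * (alpha + alpha_hat).sum()
        + y @ delta)                      # dual objective (maximization form)

# Primal objective at an arbitrary (w, b), with the optimal slacks filled in.
w, b = np.array([0.5, -1.0]), 0.2
slack = np.maximum(np.abs(X @ w + b - y) - eps, 0.0)
primal = 0.5 * w @ w + C * slack.sum()

print(dual <= primal)                     # True, by weak duality
```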
Substituting (2) back into \(f(x) = w^Tx + b\) gives the linear regression model
\[
f(x) = \sum_{i=1}^{N}\left(\alpha_i - \hat{\alpha}_i\right)x_i^{T}x + b
\]
Nonlinear support vector regression
Consider the model
\[
y = f(x) + b
\]
where \(f(x)\) is a nonlinear function. Suppose there exists a mapping \(\varphi\) from the input space \(X\) into a Hilbert space such that
\[
f(x) = w^T\varphi(x)
\]
We therefore set up the following optimization problem:
\[
\min\ \frac{1}{2}\|w\|^{2} + C\sum_{i}\left(\xi_i+\hat{\xi}_i\right)
\quad \text{s.t.}\
\begin{cases}
y_i - w^T\varphi(x_i) - b \le \varepsilon + \xi_i \\
w^T\varphi(x_i) + b - y_i \le \varepsilon + \hat{\xi}_i \\
\xi_i \ge 0,\ \hat{\xi}_i \ge 0
\end{cases}
\]
Construct the Lagrange function:
\[
\begin{aligned}
L := \frac{1}{2}\|w\|^{2} &+ C\sum_{i}\left(\xi_i+\hat{\xi}_i\right) - \sum_{i}\left(\eta_i\xi_i + \hat{\eta}_i\hat{\xi}_i\right) \\
&+ \sum_{i}\alpha_i\left(y_i - w^{T}\varphi(x_i) - b - \varepsilon - \xi_i\right) \\
&+ \sum_{i}\hat{\alpha}_i\left(w^{T}\varphi(x_i) + b - y_i - \varepsilon - \hat{\xi}_i\right)
\end{aligned}
\]
Taking partial derivatives:
\[
\begin{cases}
\dfrac{\partial L}{\partial w} = w - \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right)\varphi(x_i) = 0 \\[4pt]
\dfrac{\partial L}{\partial b} = \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right) = 0 \\[4pt]
\dfrac{\partial L}{\partial \hat{\xi}_i} = C - \hat{\alpha}_i - \hat{\eta}_i = 0 \\[4pt]
\dfrac{\partial L}{\partial \xi_i} = C - \alpha_i - \eta_i = 0
\end{cases}
\]
Substituting these back into the Lagrangian gives the dual problem
\[
\begin{aligned}
\max_{\alpha,\hat{\alpha}}\ & -\frac{1}{2}\sum_{i,j}\left(\alpha_i - \hat{\alpha}_i\right)\left(\alpha_j - \hat{\alpha}_j\right)\varphi(x_i)^{T}\varphi(x_j) - \varepsilon\sum_{i}\left(\alpha_i + \hat{\alpha}_i\right) + \sum_{i} y_i\left(\alpha_i - \hat{\alpha}_i\right) \\
\text{s.t.}\ & \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right) = 0,\quad 0 \le \alpha_i, \hat{\alpha}_i \le C
\end{aligned}
\]
Substituting \(w\) back into the model once more gives
\[
y(x) = \sum_{i}\left(\alpha_i - \hat{\alpha}_i\right)\varphi(x_i)^{T}\varphi(x) + b
\]
The feature map never needs to be evaluated explicitly: the inner product can be replaced by a kernel function \(\kappa(x_i, x) = \varphi(x_i)^{T}\varphi(x)\), which is the kernel trick.
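To see concretely that only inner products matter (my own sketch; the dual coefficients and bias are hypothetical): for the polynomial kernel \(\kappa(u,v) = (u^Tv)^2\) on 1-D inputs, the explicit feature map is \(\varphi(x) = x^2\), and predicting through the kernel matches predicting through the explicit features.

```python
import numpy as np

# Hypothetical dual coefficients beta_i = alpha_i - alpha_hat_i and bias b.
x_train = np.array([0.5, 1.0, 1.5, 2.0])
beta = np.array([0.3, -0.1, 0.2, -0.4])
b = 0.25

def kernel(u, v):
    return (u * v) ** 2            # polynomial kernel (u^T v)^2 in 1-D

def predict_kernel(x):
    # y(x) = sum_i beta_i * kappa(x_i, x) + b  -- never touches phi explicitly
    return np.sum(beta * kernel(x_train, x)) + b

def predict_explicit(x):
    # Same model via the explicit feature map phi(x) = x^2: w = sum_i beta_i phi(x_i)
    w = np.sum(beta * x_train ** 2)
    return w * x ** 2 + b

x = 1.2
print(np.isclose(predict_kernel(x), predict_explicit(x)))   # True
```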