backpropagation

Posted by cbattle


  The goal of backpropagation is to compute the partial derivatives ∂C/∂w and ∂C/∂b of the cost function C with respect to any weight w or bias b in the network. 

 We use the quadratic cost function

   C = (1/2n) Σ_x ‖y(x) − a^L(x)‖²

 where n is the total number of training examples, y(x) is the desired output for input x, and a^L(x) is the vector of activations output from the network.
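As a quick numeric sanity check, the quadratic cost can be computed directly from its definition. This is a minimal sketch in NumPy; the function name `quadratic_cost` and the toy data are illustrative, not from the text:

```python
import numpy as np

# Quadratic cost C = (1/2n) * sum over x of ||y(x) - a^L(x)||^2.
# `quadratic_cost` is an illustrative name, not from the text.
def quadratic_cost(outputs, targets):
    n = len(outputs)
    return sum(0.5 * np.sum((y - a) ** 2) for a, y in zip(outputs, targets)) / n

outputs = [np.array([0.8]), np.array([0.2])]   # network outputs a^L(x) for n = 2 examples
targets = [np.array([1.0]), np.array([0.0])]   # desired outputs y(x)
print(quadratic_cost(outputs, targets))        # ≈ 0.02
```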

 

two assumptions :

  1: The first assumption we need is that the cost function can be written as an average 

    C = (1/n) Σ_x C_x,  where for the quadratic cost C_x = ½‖y(x) − a^L(x)‖²

    The reason we need this assumption is that what backpropagation actually lets us do is compute the partial derivatives

  ∂C_x/∂w and ∂C_x/∂b for a single training example. We then recover ∂C/∂w and ∂C/∂b by averaging over training examples. In

  fact, with this assumption in mind, we'll suppose the training example x has been fixed, and drop the x subscript, writing the

  cost C_x as C. We'll eventually put the x back in, but for now it's a notational nuisance that is better left implicit.

 

  2: The cost function can be written as a function of the outputs from the neural network

   C = ½‖y − a^L‖² = ½ Σ_j (y_j − a^L_j)²   (a function of the output activations a^L)

the Hadamard product

   (s⊙t)_j = s_j t_j

  For example, [1, 2] ⊙ [3, 4] = [1·3, 2·4] = [3, 8].
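In code, the Hadamard product is just elementwise multiplication. A quick NumPy illustration:

```python
import numpy as np

s = np.array([1, 2])
t = np.array([3, 4])
# In NumPy, `*` on arrays is already elementwise, i.e. the Hadamard product.
print(s * t)  # [3 8]
```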

The four fundamental equations behind backpropagation

   δ^L = ∇_a C ⊙ σ′(z^L)                       (BP1)
   δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l)       (BP2)
   ∂C/∂b^l_j = δ^l_j                           (BP3)
   ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j              (BP4)

 

 

 

BP1 

   δ^l_j : the error in the jth neuron in the lth layer, defined by

     δ^l_j ≡ ∂C/∂z^l_j

    You might wonder why the demon is changing the weighted input z^l_j. Surely it'd be more natural to imagine the demon changing

   the output activation a^l_j, with the result that we'd be using ∂C/∂a^l_j as our measure of error. In fact, if you do this, things work out quite

  similarly to the discussion below, but it turns out to make the presentation of backpropagation a little more algebraically complicated.

   So we'll stick with δ^l_j = ∂C/∂z^l_j as our measure of error.

   An equation for the error in the output layer, δ^L: The components of δ^L are given by

  δ^L_j = (∂C/∂a^L_j) σ′(z^L_j)    (BP1)

  It's easy to rewrite the equation in a matrix-based form, as

  δ^L = ∇_a C ⊙ σ′(z^L)

  For the quadratic cost we have ∇_a C = (a^L − y), so

  δ^L = (a^L − y) ⊙ σ′(z^L)
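As a sanity check, the output-layer error for the quadratic cost can be computed directly from (BP1). A minimal sketch; the numbers for z^L and y are made up for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# BP1 for the quadratic cost: grad_a C = (a^L - y), so delta^L = (a^L - y) ⊙ σ'(z^L).
zL = np.array([0.0, 2.0])   # weighted inputs to the output layer (made-up numbers)
aL = sigmoid(zL)
y = np.array([1.0, 0.0])    # desired output (made-up)
delta_L = (aL - y) * sigmoid_prime(zL)
print(delta_L)
```

Note that the first component is (0.5 − 1) · 0.25 = −0.125, since σ(0) = 0.5 and σ′(0) = 0.25.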

BP2

  An equation for the error δ^l in terms of the error in the next layer, δ^{l+1}:

  δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l)    (BP2)

  By combining (BP2) with (BP1) we can compute the error δ^l for any layer in the network: first use (BP1) to compute δ^L, then apply (BP2) repeatedly to move backwards through the network.

 

BP3

  An equation for the rate of change of the cost with respect to any bias in the network:

  ∂C/∂b^l_j = δ^l_j    (BP3)

  That is, the error δ^l_j is exactly equal to the rate of change ∂C/∂b^l_j; in shorthand, ∂C/∂b = δ.

BP4

  An equation for the rate of change of the cost with respect to any weight in the network:

  ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j    (BP4)

  In less index-heavy shorthand, ∂C/∂w = a_in δ_out, where a_in is the activation of the neuron input to the weight w, and δ_out is the error of the neuron output from the weight. A consequence is that weights output from low-activation neurons learn slowly.
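The four equations together give the full backpropagation procedure for a single training example. A minimal NumPy sketch under the assumptions above (quadratic cost, sigmoid activations); the network shapes and variable names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Backprop for one training example (x, y) using BP1-BP4.
# `weights[l]` has shape (neurons in layer l+1, neurons in layer l); a sketch, not
# a definitive implementation.
def backprop(weights, biases, x, y):
    # Forward pass: store all weighted inputs z^l and activations a^l.
    a, activations, zs = x, [x], []
    for w, b in zip(weights, biases):
        z = w @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    # BP1: error in the output layer (quadratic cost: grad_a C = a^L - y).
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    nabla_b = [np.zeros_like(b) for b in biases]
    nabla_w = [np.zeros_like(w) for w in weights]
    # BP3 and BP4 at the output layer.
    nabla_b[-1] = delta
    nabla_w[-1] = np.outer(delta, activations[-2])
    # BP2: propagate the error backwards, applying BP3/BP4 at each layer.
    for l in range(2, len(weights) + 1):
        delta = (weights[-l + 1].T @ delta) * sigmoid_prime(zs[-l])
        nabla_b[-l] = delta
        nabla_w[-l] = np.outer(delta, activations[-l - 1])
    return nabla_w, nabla_b
```

The returned `nabla_w` and `nabla_b` are the gradients ∂C_x/∂w and ∂C_x/∂b for the single example; averaging them over a batch of examples recovers ∂C/∂w and ∂C/∂b, per assumption 1 above.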

 reference: http://neuralnetworksanddeeplearning.com/chap2.html

------------------------------------------------------------------------------------------------


 

reference: Machine Learning by Andrew Ng
