Hebbian Learning Rule
Posted by everyday_haoguo
Learning Rule
Learning rules, for a connectionist system, are the algorithms or equations that govern changes in the weights of the connections in a network. One of the simplest learning procedures for two-layer networks is the Hebbian learning rule, based on a rule originally proposed by Hebb in 1949. Hebb's rule states that the simultaneous excitation of two neurons results in a strengthening of the connection between them. More powerful learning rules incorporate an error-reduction or error-correction procedure (e.g. the delta rule, the generalized delta rule, backpropagation). Learning rules with an error-reduction procedure use the discrepancy between the desired output pattern and the actual output pattern to change the network's weights during training. The learning rule is typically applied repeatedly to the same set of training inputs across a large number of epochs (training loops), with the error gradually reduced across epochs as the weights are fine-tuned.
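To make the two update styles concrete, here is a minimal NumPy sketch of a Hebbian weight update and a delta-rule (error-correction) update for a two-layer network; the names (eta, hebbian_update, delta_update) and the toy data are illustrative assumptions, not taken from the article.

```python
# Minimal sketch: Hebbian vs. delta-rule weight updates (assumed example, not from the article).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 2
W = np.zeros((n_out, n_in))   # connection weights, one row per output unit
eta = 0.1                     # learning rate

def hebbian_update(W, x, y, eta):
    """Hebb's rule: strengthen each connection in proportion to the product
    of pre-synaptic activity x and post-synaptic activity y."""
    return W + eta * np.outer(y, x)

def delta_update(W, x, y_target, eta):
    """Error-correction (delta) rule: change weights in proportion to the
    discrepancy between the desired and the actual output."""
    y_actual = W @ x
    error = y_target - y_actual
    return W + eta * np.outer(error, x)

# Repeated presentation of the same training set across many epochs,
# as described above; error shrinks as the weights are fine-tuned.
inputs = rng.random((5, n_in))
targets = rng.random((5, n_out))
for epoch in range(100):
    for x, t in zip(inputs, targets):
        W = delta_update(W, x, t, eta)
```

Note that the pure Hebbian update has no error signal, so weights only grow with repeated co-activation unless some normalization or decay term is added; the delta rule, by contrast, stops changing the weights once the actual output matches the target.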