Hebbian update rule
The Hebbian update rule is the basic Hebbian-type update rule. It is inspired by memory mechanisms in biological neural networks: in the brain, a synapse is strengthened when strong signals pass through it, and this is essentially what the Hebbian update does.
Hebbian update is an associative update rule, suitable for remembering patterns. A system trained with the Hebbian update will output high values when presented with a familiar pattern and low values when presented with an unfamiliar one. Anti-Hebbian learning does the exact opposite: it triggers a strong response on new patterns.
The big disadvantage of the Hebbian rule is that it is unstable and will always diverge over time. There are versions of the rule, such as Oja's rule, that avoid this.
The update of the weights is a simple product of the inputs and the outputs:
Δw = η · y · x
where η is the learning rate, x is the input vector, y = wᵀx is the output, and w is the weight vector.
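To make the rule concrete, below is a minimal sketch in NumPy (the function names, the learning rate eta and the example patterns are illustrative assumptions, not taken from the software). It shows the update itself, the strong response to a familiar pattern versus an unfamiliar one, and how Oja's rule keeps the weight norm bounded where the plain rule keeps growing.

```python
# Minimal illustrative sketch of the plain Hebbian update and Oja's rule.
# Assumes NumPy; all names here are hypothetical, not part of the software.
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(w, x, eta=0.1):
    """Plain Hebbian update: delta_w = eta * y * x, with y = w.x."""
    y = w @ x
    return w + eta * y * x

def oja_step(w, x, eta=0.1):
    """Oja's rule: delta_w = eta * y * (x - y * w), keeps |w| bounded."""
    y = w @ x
    return w + eta * y * (x - y * w)

pattern = np.array([1.0, -1.0, 1.0, -1.0])   # the "familiar" pattern
novel   = np.array([1.0,  1.0, 1.0,  1.0])   # an unfamiliar pattern

w = rng.normal(scale=0.1, size=4)
for _ in range(50):
    w = hebbian_step(w, pattern)

print("response to familiar pattern:", w @ pattern)        # large magnitude
print("response to novel pattern:   ", w @ novel)          # comparatively small
print("weight norm (plain Hebbian): ", np.linalg.norm(w))  # keeps growing

w = rng.normal(scale=0.1, size=4)
for _ in range(50):
    w = oja_step(w, pattern)
print("weight norm (Oja's rule):    ", np.linalg.norm(w))  # settles near 1
```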
The settings can be modified using the settings browser.
|Hebbian update settings|
- Apart from novelty filtering (obtained by setting the step to a small negative value; see the sketch after the links below), other Hebbian-based rules are usually a better alternative, due to the inherent instability of the plain Hebbian update rule.
- Hebbian layer - A Synapse block that uses the Hebbian learning update rule by default.
- Hebbian learning - General article on Hebbian learning.
- Synapse:GHA update rule - Generalized Hebbian Algorithm, a variation on the Hebbian rule that performs PCA.
- The talented dr Hebb part 1 - Blog tutorial on novelty filtering.
- Hebbian novelty filters for financial analysis - Blog tutorial on novelty filtering.
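As mentioned in the note above, novelty filtering can be obtained by using a small negative step, i.e. an anti-Hebbian update. The following is a hypothetical sketch of that idea (assuming NumPy; the names and patterns are illustrative only): repeated exposure drives the response to a familiar pattern towards zero, while an unseen pattern still produces a clear response.

```python
# Hypothetical sketch of a Hebbian novelty filter (anti-Hebbian update,
# i.e. a small negative step). Illustrative only, not the software's code.
import numpy as np

def anti_hebbian_step(w, x, eta=0.05):
    """Anti-Hebbian update: delta_w = -eta * y * x, so responses to
    repeatedly seen patterns are driven towards zero."""
    y = w @ x
    return w - eta * y * x

familiar = np.array([1.0, 0.0, 1.0, 0.0])
novel    = np.array([0.0, 1.0, 0.0, 1.0])

w = np.ones(4)
for _ in range(200):
    w = anti_hebbian_step(w, familiar)

# The filter is now "bored" by the familiar pattern but still reacts to a new one.
print("response to familiar pattern:", w @ familiar)  # close to zero
print("response to novel pattern:   ", w @ novel)     # unchanged, clearly non-zero
```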