An update rule (or learning rule) is an adaptation algorithm attached to a block component that modifies the weights or other free parameters of that block.
In the Block
A block can have two different update rules attached to it: the forward update rule, which must be directly compatible with the forward propagator, and the back update rule, which must be compatible with the back propagator. The two update rules are activated in the forward and back passes of the control system, respectively.
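As an illustrative sketch (all class and method names here are hypothetical, not part of the framework's actual API), a block carrying a forward and a back update rule might look like this:

```python
class UpdateRule:
    """Minimal update-rule interface (hypothetical)."""
    def update(self, weights, signal):
        raise NotImplementedError

class ScaleOnUpdate(UpdateRule):
    """Toy rule for illustration: shrinks the weights slightly each pass."""
    def update(self, weights, signal):
        for i in range(len(weights)):
            weights[i] *= 0.9

class Block:
    """A block with two attached update rules: the forward rule fires
    during the forward pass, the back rule during the back pass."""
    def __init__(self, weights, forward_rule, back_rule):
        self.weights = weights
        self.forward_rule = forward_rule
        self.back_rule = back_rule

    def forward(self, x):
        # Stand-in forward propagator: scale the input by each weight.
        y = [w * x for w in self.weights]
        self.forward_rule.update(self.weights, x)
        return y

    def backward(self, error):
        # Stand-in back propagator: only the back update rule acts here.
        self.back_rule.update(self.weights, error)
```

The propagators here are trivial placeholders; the point is only that each pass of the control system triggers its own rule.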
The "No update" update rule is compatible with all propagators and, as the name implies, does nothing. It is used as a fallback when no other compatible update rule is found.
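A sketch of that fallback behaviour, under assumed names (`pick_rule` and the `compatible_with` attribute are illustrative, not framework API):

```python
class NoUpdate:
    """No-op rule: compatible with every propagator, leaves weights untouched."""
    compatible_with = {"forward", "back"}
    def update(self, weights, signal):
        pass

class DeltaRule:
    """Hypothetical rule compatible only with a back propagator."""
    compatible_with = {"back"}
    def update(self, weights, signal):
        for i in range(len(weights)):
            weights[i] += 0.1 * signal

def pick_rule(candidates, propagator_kind):
    """Return the first candidate rule compatible with the given
    propagator kind, falling back to the no-op "No update" rule."""
    for rule in candidates:
        if propagator_kind in rule.compatible_with:
            return rule
    return NoUpdate()
```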
Types of update rules
There are two broad categories of update rules:
- Supervised, where the update rule uses external feedback to modify the weights.
- Unsupervised, where the update rule modifies the weights based only on their internal structure, without external feedback.
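The distinction can be sketched with two toy rules (hypothetical functions written for illustration): the supervised one needs an external error signal, while the unsupervised one looks only at the weights themselves.

```python
def supervised_delta(weights, inputs, error, lr=0.1):
    # Supervised: an externally supplied error signal drives the change.
    return [w + lr * error * x for w, x in zip(weights, inputs)]

def unsupervised_normalize(weights):
    # Unsupervised: the modification depends only on the weights' own
    # structure (here, rescaling to unit L1 norm as an illustration).
    total = sum(abs(w) for w in weights) or 1.0
    return [w / total for w in weights]
```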
While most supervised rules are used in combination with an external error metric provider, such as the delta terminator block, in some cases the block itself takes care of the feedback loop, given an additional external input. The Naive Bayes block is an example of such a block.
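A minimal sketch of that pattern, loosely in the spirit of a Naive Bayes block (all names hypothetical, not the framework's implementation): the block receives the target label as an extra input and closes the feedback loop internally by updating its own counts.

```python
from collections import defaultdict

class SelfSupervisedCountBlock:
    """Block that manages its own feedback loop: instead of an external
    error signal, it takes the label as an additional input and updates
    its internal counts directly."""
    def __init__(self):
        self.class_counts = defaultdict(int)
        self.feature_counts = defaultdict(lambda: defaultdict(int))

    def update(self, features, label):
        # The label arrives as an extra input; no external error metric needed.
        self.class_counts[label] += 1
        for f in features:
            self.feature_counts[label][f] += 1

    def predict(self, features):
        # Pick the class with the highest count-based score
        # (no smoothing; sketch only).
        best, best_score = None, float("-inf")
        total = sum(self.class_counts.values())
        for c, cc in self.class_counts.items():
            score = cc / total
            for f in features:
                score *= self.feature_counts[c][f] / cc
            if score > best_score:
                best, best_score = c, score
        return best
```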
- List of Update rule components: a list of all Synapse block components.