A snippet is a partial topology that has been saved for reuse. Snippets can be used to rapidly construct adaptive systems with one click.
To insert an existing snippet, right-click on the work area (the white region in the middle of the screen). You'll get a pop-up menu from which you can choose "Insert Snippet". There you can select the snippet that you want:
To save a topology as a snippet, first select all blocks you want to include (you can select all blocks by pressing CTRL+A). Right-click on any of the selected blocks to get a context menu:
Select "Create NetSnippet..". You will get a new window, in which you can write a name, author and description:
Click Save, and you will be asked for a location. The default user snippet directory is under My Documents/Peltarion Synapse/Snippets.
Suppose we create a new directory called "Cattle" (under My Documents/Peltarion Synapse/Snippets) and save the file in it (CattleNet.synsnip).
Back in Design, right-click on the work area and select "Insert Snippet..". Notice that there is now a "Cattle" category with a "CattleNet" snippet.
Synapse snippet library
Synapse comes with a collection of snippets covering popular types of systems.
Classifiers are used to categorize data into two or more classes. They are supervised algorithms.
A simple and fast classifier that, despite its simplicity, is often capable of solving moderately complex classification problems. It uses the Bayesian block as the classification engine.
Support vector machine
A very powerful non-linear, binary classifier that can solve complex classification problems. It is computationally very slow and sensitive to changes in settings. It uses the support vector machine block as the classification engine.
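Synapse assembles these classifiers graphically from blocks, so no code is involved; purely as an outside illustration, roughly equivalent classifiers can be sketched in Python with scikit-learn. The data and the choice of GaussianNB as a stand-in for the Bayesian block are assumptions made for this example, not part of Synapse:

    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC

    # Hypothetical two-class data: four samples, two features each.
    X = np.array([[0.1, 1.2], [0.9, 0.4], [1.1, 0.3], [0.2, 1.0]])
    y = np.array([0, 1, 1, 0])

    bayes = GaussianNB().fit(X, y)     # simple, fast Bayesian-style classifier
    svm = SVC(kernel="rbf").fit(X, y)  # non-linear binary classifier, slower to train
    print(bayes.predict(X), svm.predict(X))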
Gamma One Layer
Dynamic neural network of intermediate complexity, suitable for time series modeling and dynamic filtering. The primary memory component is the gamma layer block (a sketch of the gamma memory follows these entries).
Gamma Two Layer
Dynamic neural network of high complexity suitable for time series modeling and dynamic filtering.
Gamma Recurrent Hybrid
Dynamic neural network that combines gamma and infinite impulse response memories. Suitable for simple dynamic problems that contain multifrequency dynamics.
Recurrent One Layer
Simple recurrent dynamic neural network that incorporates infinite impulse response filtering. Solutions can become divergent.
Recurrent Two Layer
Recurrent dynamic neural network that incorporates infinite impulse response filtering. Relatively powerful for dynamic problems but solutions can easily become divergent.
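The gamma memory these dynamic snippets are built around follows a simple recurrence. Below is a minimal Python sketch of the standard gamma memory definition; this is an illustration under that assumption, not Synapse's internal code. mu is the memory parameter and K the number of stages:

    import numpy as np

    def gamma_memory(u, K=3, mu=0.5):
        # K-stage gamma memory over the input sequence u.
        # Stage update: x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1),
        # where stage 0 is the raw input. Returns shape (len(u), K+1).
        x = np.zeros(K + 1)
        out = np.zeros((len(u), K + 1))
        for t, u_t in enumerate(u):
            prev = x.copy()
            x[0] = u_t
            for k in range(1, K + 1):
                x[k] = (1 - mu) * prev[k] + mu * prev[k - 1]
            out[t] = x
        return out

    taps = gamma_memory(np.sin(np.linspace(0, 10, 100)))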
Expanders perform feature expansion (i.e., they transform categorical data from a single feature into one feature per category). The expanders use the fuzzy logic block for their operation.
Expands one binary feature into two binary nominal features: one value maps to [1 0] and the other to [0 1].
Expands a three-valued feature into three binary features.
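As a plain-code illustration of what an expander computes (in Synapse this is done by the fuzzy logic block), here is a hypothetical one-hot expansion in Python:

    import numpy as np

    def expand(feature, n_categories):
        # One-hot expansion: category i becomes a vector with a 1 at index i,
        # e.g. a binary feature expands to [1, 0] and [0, 1].
        out = np.zeros((len(feature), n_categories))
        out[np.arange(len(feature)), feature] = 1
        return out

    print(expand(np.array([0, 1, 1]), 2))   # binary expander
    print(expand(np.array([2, 0, 1]), 3))   # three-valued expander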
Focused learning is a way of building systems that handle time series data while using a static control system. While not as powerful as true dynamic systems, they are considerably faster.
A dynamic neural network that can be trained with a static control system (faster).
Time Delay Neural Network
A focused neural network for solving temporal problems. Memory depth is controlled by the number of taps on the Gamma memory.
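The focused idea, handing a static network a window of delayed samples instead of using true recurrence, can be illustrated with a simple sliding-window sketch in Python. The function name and tap count here are hypothetical:

    import numpy as np

    def delay_embed(series, taps):
        # Build rows of [u(t), u(t-1), ..., u(t-taps+1)] so that a static
        # network can see a short history of the signal all at once.
        rows = [series[t - taps + 1 : t + 1][::-1]
                for t in range(taps - 1, len(series))]
        return np.array(rows)

    u = np.sin(np.linspace(0, 10, 50))
    X = delay_embed(u, taps=4)   # each row is one static training example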
More specialized types of networks.
A radial basis function neural network. It uses Gaussian activation functions whose parameters are determined through unsupervised learning. It performs more localized adaptation than other types of neural networks.
Wavelet One Layer
A basic wavelet neural network (WNN). It can outperform a regular MLP on some function modeling problems. Its basic operating principle is the use of wavelets, a specialized family of functions that are scaled, translated, rotated, and combined to produce a better function fit.
A compound wavelet network that uses elements from both WNN and MLP networks.
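To make the operating principles above concrete, here is a minimal Python sketch of a single Gaussian RBF unit and a single wavelet unit. The Mexican-hat wavelet is only one common WNN choice, assumed here for illustration; the actual Synapse blocks are more elaborate:

    import numpy as np

    def rbf_unit(x, center, width):
        # Gaussian radial basis unit: responds strongly only near its center.
        return np.exp(-np.sum((x - center) ** 2) / (2 * width ** 2))

    def wavelet_unit(x, scale, shift):
        # Scaled and translated Mexican-hat wavelet.
        z = (x - shift) / scale
        return (1 - z ** 2) * np.exp(-z ** 2 / 2)

    print(rbf_unit(np.array([0.5, 0.5]), np.array([0.0, 0.0]), 1.0))
    print(wavelet_unit(0.3, scale=2.0, shift=0.1))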
Static systems are used for data that has no temporal dependencies.
MLP One Layer
The Multi-Layer Perceptron (MLP) is a basic static feedforward backpropagation neural network. It is a good starting point for most classification and function modeling problems (a minimal forward-pass sketch follows this category's entries).
MLP Two Layer
Static feedforward backpropagation neural network of intermediate complexity. With the right configuration and data, it is theoretically capable of solving any function modeling or classification problem.
Generalized One Layer
An extension of the standard One Layer MLP neural network that in many cases is capable of solving a problem more efficiently. It is suitable for function modeling and classification tasks where plenty of data is available.
Generalized Two Layer
An extension of the standard Two Layer MLP neural network that in many cases is capable of solving a problem more efficiently. It is suitable for function modeling and classification tasks where plenty of data is available.
A static neural network that has two main branches. During adaptation the branches generally compete against each other, often resulting in a system that is capable of better generalization.
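As a minimal illustration of the forward computation that all the MLP variants above share, here is a hypothetical one-hidden-layer pass in Python. The weights are random placeholders; Synapse configures all of this graphically:

    import numpy as np

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # input -> hidden
    W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # hidden -> output

    def mlp_forward(x):
        # One-hidden-layer MLP: output = W2 * tanh(W1 x + b1) + b2.
        # A two-layer MLP simply repeats the hidden step once more.
        h = np.tanh(W1 @ x + b1)
        return W2 @ h + b2

    print(mlp_forward(np.array([0.2, -0.4])))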
Unsupervised learning means that no "correct" answer is provided as feedback to the system. Instead, the algorithm has to make sense of the input data alone.
Anti-Hebbian Novelty Filter
Combination of two competitive components, one with projection output and the other with nearest unit output. Useful as a pre-stage to supervised networks.
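The "nearest unit" component mentioned above is a form of competitive learning. Below is a minimal, hypothetical sketch of one winner-take-all update step in Python, not the Synapse implementation:

    import numpy as np

    def competitive_step(weights, x, lr=0.1):
        # Move the weight vector nearest to x a little closer to x;
        # only the winning unit adapts (winner-take-all).
        winner = np.argmin(np.linalg.norm(weights - x, axis=1))
        weights[winner] += lr * (x - weights[winner])
        return winner

    units = np.random.default_rng(1).normal(size=(4, 2))
    for x in np.random.default_rng(2).normal(size=(20, 2)):
        competitive_step(units, x)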