| Interface | Description |
|---|---|
| IErrorPatternListener | |
| InputPatternListener | This interface represents an input synapse for a generic layer. |
| Learnable | |
| LearnableLayer | |
| LearnableSynapse | |
| Learner | |
| LearnerFactory | Learner factories provide the synapses and layers, through the Monitor object, with Learners. |
| NeuralElement | This interface represents a generic element of a neural network. |
| NeuralLayer | This is the interface for all the layer objects of the neural network. |
| NeuralNetListener | |
| OutputPatternListener | This interface represents an output synapse for a generic layer. |
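The two pattern-listener interfaces describe the two sides of a layer's connections: an input-side synapse feeds patterns into a layer, an output-side synapse receives patterns from it. A minimal plain-Java sketch of that contract — the method names and signatures here are hypothetical illustrations, not Joone's actual API:

```java
// Hypothetical sketch of the input/output listener roles described above.
// Joone's real interfaces use different method signatures.
interface InputPatternListener {
    double[] fwdGet();               // supply the next pattern to the attached layer
}

interface OutputPatternListener {
    void fwdPut(double[] pattern);   // receive a pattern produced by the attached layer
}

// A trivial "synapse" playing both roles: it stores what one layer
// writes and hands it to the next layer that asks for input.
class PassThroughSynapse implements InputPatternListener, OutputPatternListener {
    private double[] last = new double[0];

    public void fwdPut(double[] pattern) { last = pattern.clone(); }

    public double[] fwdGet() { return last; }
}
```

A single object implementing both interfaces can thus sit between two layers, acting as the output synapse of the first and the input synapse of the second.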
| Class | Description |
|---|---|
| AbstractEventNotifier | This class raises an event notification by invoking the corresponding Monitor.fireXXX method. |
| AbstractLearner | This class provides some basic functionality that can be used (extended) by other learners. |
| BasicLearner | |
| BatchLearner | BatchLearner stores the weight/bias changes during the batch and updates them after the batch is done. |
| BiasedLinearLayer | This layer consists of linear neurons. |
| BufferedSynapse | This class implements a synapse that provides asynchronous methods to write output patterns. |
| CircularSpatialMap | This class implements the SpatialMap interface, providing a circular spatial map for use with the GaussianLayer and Kohonen networks. |
| ContextLayer | The context layer is similar to the linear layer, except that it has an auto-recurrent connection between its output and input. |
| DelayLayer | Delay unit that creates temporal windows Yk(t-N) from a time series. |
| DelayLayerBeanInfo | |
| DelaySynapse | This Synapse connects the N input neurons with the M output neurons using a matrix of FIRFilter elements of size NxM. |
| DirectSynapse | This is a forward-only synapse. |
| EKFFFNLearnerPlugin | A plugin listener that implements the EKFFFN learner used to train feed-forward neural networks. |
| EKFRNNLearnerPlugin | A plugin listener that implements the EKF learner, based on "Some observations on the use of the extended Kalman filter as a recurrent network learning algorithm" by Williams (1992), in order to train a network. |
| ExtendableLearner | Learners that extend this class must implement a fixed set of methods, a so-called skeleton. |
| ExtendedKalmanFilterFFN | Implements the extended Kalman filter (EKF) as described in "Using an extended Kalman filter learning algorithm for feed-forward neural networks to describe tracer correlations" by Lary and Mussa (2004) in order to train a feed-forward neural network. |
| ExtendedKalmanFilterRNN | Implements the extended Kalman filter (EKF) as described in "Some observations on the use of the extended Kalman filter as a recurrent network learning algorithm" by Williams (1992) in order to train a recurrent neural network. |
| Fifo | The Fifo class represents a first-in-first-out (FIFO) queue of objects. |
| FIRFilter | Element of a connection representing a FIR (Finite Impulse Response) filter. |
| FreudRuleFullSynapse | Deprecated: possible bug in implementation. |
| FullSynapse | |
| GaussianLayer | This layer implements the Gaussian Neighborhood SOM strategy. |
| GaussianLayerBeanInfo | |
| GaussianSpatialMap | This class implements the SpatialMap interface, providing a Gaussian spatial map for use with the GaussianLayer and Kohonen networks. |
| GaussLayer | The output of a Gauss(ian) layer neuron is the sum of the weighted input values, applied to a Gaussian curve (exp(-x * x)). |
| KohonenSynapse | This is an unsupervised Kohonen synapse, which implements a Self-Organising Map (SOM). |
| KohonenSynapseBeanInfo | |
| Layer | The Layer object is the basic element forming the neural net. |
| LayerBeanInfo | |
| LinearLayer | The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter. |
| LinearLayerBeanInfo | |
| LogarithmicLayer | This layer implements a logarithmic transfer function. |
| Matrix | The Matrix object represents the connection matrix of the weights of a synapse or the biases of a layer. |
| MatrixBeanInfo | |
| MemoryLayer | |
| MemoryLayerBeanInfo | |
| Monitor | The Monitor object is the controller of the behavior of the neural net. |
| MonitorBeanInfo | |
| NetErrorManager | This class should be used whenever a critical error occurs that would impact the training or running of the network. |
| NetStoppedEventNotifier | Raises the netStopped event from within a separate Thread. |
| NeuralNetAdapter | |
| NeuralNetEvent | Transport class used to notify the events raised by a neural network. |
| OutputSwitchSynapse | This class acts as a switch that can connect its input to one of its connected output synapses. |
| OutputSwitchSynapseBeanInfo | |
| Pattern | The Pattern object contains the data that must be processed by a neural net. |
| PatternBeanInfo | |
| RbfGaussianLayer | This class implements the nonlinear layer in Radial Basis Function (RBF) networks using Gaussian functions. |
| RbfGaussianParameters | This class defines parameters such as the center and sigma. |
| RbfInputSynapse | The synapse to the input of a radial basis function layer shouldn't provide a single value to every neuron in the output (RBF) layer, as is usually the case. |
| RbfLayer | This is the base (helper) class for radial basis function layers. |
| RpropLearner | This class implements the RPROP learning algorithm. |
| RpropParameters | This object holds the global parameters for the RPROP learning algorithm (RpropLearner). |
| RTRL | An RTRL (Real-Time Recurrent Learning) implementation. |
| RTRLLearnerFactory | A factory that provides RTRL learners. |
| RTRLLearnerPlugin | A plugin listener that applies the RTRL algorithm to a network. |
| SangerSynapse | This synapse is useful for extracting the principal components from an input data set. |
| SigmoidLayer | The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function. |
| SimpleLayer | This abstract class represents layers that are composed of neurons that implement some transfer function. |
| SimpleLayerBeanInfo | |
| SineLayer | The output of a sine layer neuron is the sum of the weighted input values, applied to a sine function (sin(x)). |
| SoftmaxLayer | The outputs of the Softmax layer must be interpreted as probabilities. |
| SpatialMap | SpatialMap is intended to be an abstract spatial map for use with a GaussianLayer. |
| Synapse | The Synapse is the connection element between two Layer objects. |
| SynapseBeanInfo | |
| TanhLayer | Layer that applies the hyperbolic tangent transfer function to its input patterns. |
| TanhLayerBeanInfo | |
| WTALayer | This layer implements the Winner Takes All SOM strategy. |
| WTALayerBeanInfo | |
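Several of the layer classes above differ only in the transfer function they apply to the weighted sum of their inputs. A plain-Java sketch of those functions, using only the formulas the descriptions state — the class and method names are illustrative, not Joone source:

```java
// Illustrative sketch of the transfer functions named in the table above;
// not Joone source code.
public class TransferSketch {
    // LinearLayer: weighted input sum scaled by the beta parameter
    static double linear(double x, double beta) { return beta * x; }

    // SigmoidLayer: logistic sigmoid
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // SineLayer: sin(x)
    static double sine(double x) { return Math.sin(x); }

    // GaussLayer: exp(-x * x)
    static double gauss(double x) { return Math.exp(-x * x); }

    // TanhLayer: hyperbolic tangent
    static double tanh(double x) { return Math.tanh(x); }

    // SoftmaxLayer: outputs sum to 1, so they can be read as probabilities
    static double[] softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);   // shift for numerical stability
        double[] out = new double[x.length];
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }
}
```

Shifting the softmax inputs by their maximum does not change the result but avoids floating-point overflow for large inputs.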
Submit Feedback to pmarrone@users.sourceforge.net