public class SigmoidLayer extends SimpleLayer implements LearnableLayer
See Also: Serialized Form

Fields inherited from parent classes: bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG

| Constructor and Description |
|---|
| SigmoidLayer() - The constructor |
| SigmoidLayer(java.lang.String ElemName) - The constructor |
| Modifier and Type | Method and Description |
|---|---|
| void | backward(double[] pattern) - Reverse transfer function of the component. |
| void | forward(double[] pattern) - This method accepts an array of values in input and forwards it according to the Sigmoid propagation pattern. |
| double | getDefaultState() - Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer. |
| double | getDerivative(int i) - Similar to the backward message; used by RTRL. |
| double | getFlatSpotConstant() - Gets the flat spot constant. |
| Learner | getLearner() - Deprecated. Used only for backward compatibility. |
| double | getMaximumState() - Return the maximum value of a node in this layer. |
| double | getMinimumState() - Return the minimum value of a node in this layer. |
| void | setFlatSpotConstant(double aConstant) - Sets the constant to overcome the flat spot problem. |
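As a rough illustration of the forward and backward methods summarized above, the sketch below shows the logistic function y = 1 / (1 + e^-x) that a sigmoid layer applies, and the derivative scaling y * (1 - y) used on the reverse pass. This is a self-contained, plain-Java sketch for illustration only; it is not Joone's implementation, and the class and method names are invented for the example.

```java
// Self-contained sketch of the sigmoid propagation pattern.
// NOT Joone code; names are invented for this example.
public class SigmoidSketch {

    // forward: apply the logistic function y = 1 / (1 + e^-x) element-wise.
    static double[] forward(double[] pattern) {
        double[] outs = new double[pattern.length];
        for (int i = 0; i < pattern.length; i++) {
            outs[i] = 1.0 / (1.0 + Math.exp(-pattern[i]));
        }
        return outs;
    }

    // backward: scale each incoming gradient by the sigmoid derivative
    // y * (1 - y), computed from the outputs of the forward pass.
    static double[] backward(double[] gradient, double[] outs) {
        double[] back = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            back[i] = gradient[i] * outs[i] * (1.0 - outs[i]);
        }
        return back;
    }

    public static void main(String[] args) {
        double[] y = forward(new double[]{0.0, 2.0});
        System.out.println(y[0]);                              // 0.5 at x = 0
        System.out.println(backward(new double[]{1.0, 1.0}, y)[0]); // 0.25 = 0.5 * (1 - 0.5)
    }
}
```

Note that an input of 0 yields an output of 0.5, matching the documented default state of a sigmoid layer.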
Methods inherited from class SimpleLayer: getLearningRate, getLrate, getMomentum, setDimensions, setLrate, setMomentum, setMonitor

Methods inherited from class Layer: addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, check, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getDimension, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, Inspections, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setOutputDimension, setOutputSynapses, setRows, start, stop, sumBackInput, sumInput, toString

Methods inherited from class java.lang.Object: clone, equals, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Learnable: getMonitor, initLearner

Methods inherited from interface NeuralLayer: addInputSynapse, addNoise, addOutputSynapse, check, copyInto, getAllInputs, getAllOutputs, getBias, getLayerName, getMonitor, getRows, isRunning, removeAllInputs, removeAllOutputs, removeInputSynapse, removeOutputSynapse, setAllInputs, setAllOutputs, setBias, setLayerName, setMonitor, setRows, start

Constructor Detail

public SigmoidLayer()

The constructor.
public SigmoidLayer(java.lang.String ElemName)

The constructor.

Parameters: ElemName - The name of the Layer

Method Detail

public void backward(double[] pattern) throws JooneRuntimeException

Reverse transfer function of the component.

Overrides: backward in class SimpleLayer

Parameters: pattern - input pattern on which to apply the transfer function

Throws: JooneRuntimeException

public double getDerivative(int i)

Similar to the backward message; used by RTRL.

Overrides: getDerivative in class Layer

public void forward(double[] pattern) throws JooneRuntimeException

This method accepts an array of values in input and forwards it according to the Sigmoid propagation pattern.

Overrides: forward in class Layer

Parameters: pattern -

Throws: JooneRuntimeException - This Exception is a wrapper Exception when an Exception is thrown while doing the maths.

See Also: Layer.forward(double[])

public Learner getLearner()

Deprecated. Used only for backward compatibility.

Specified by: getLearner in interface Learnable

Overrides: getLearner in class Layer

See Also: Learnable.getLearner()

public void setFlatSpotConstant(double aConstant)

Sets the constant to overcome the flat spot problem.

Parameters: aConstant -

public double getFlatSpotConstant()

Gets the flat spot constant.
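The flat spot problem arises because the sigmoid derivative y * (1 - y) approaches zero when a node saturates near 0 or 1, so weight updates all but stop. Adding a small constant keeps a minimal gradient flowing. The sketch below illustrates the idea in plain Java; it is not Joone's implementation, and the constant value 0.1 is purely illustrative, not a Joone default.

```java
// Why a flat-spot constant helps; a sketch, NOT Joone code.
public class FlatSpotSketch {

    // Sigmoid derivative with a flat-spot constant added. When a node
    // saturates (output near 0 or 1), y * (1 - y) approaches zero and
    // weight updates stall; the added constant keeps a minimal gradient.
    static double derivative(double y, double flatSpotConstant) {
        return y * (1.0 - y) + flatSpotConstant;
    }

    public static void main(String[] args) {
        double saturated = 0.999;                       // node stuck near its maximum
        System.out.println(derivative(saturated, 0.0)); // almost zero: learning stalls
        System.out.println(derivative(saturated, 0.1)); // about 0.101: updates continue
    }
}
```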
public double getDefaultState()

Return the default state of a node in this layer, such as 0 for a tanh or 0.5 for a sigmoid layer.

Overrides: getDefaultState in class Layer

public double getMinimumState()

Return the minimum value of a node in this layer.

Overrides: getMinimumState in class Layer

public double getMaximumState()

Return the maximum value of a node in this layer.
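Taken together, getMinimumState, getMaximumState, and getDefaultState describe the layer's output range. The sketch below shows how a caller might use such values; the constants mirror the standard sigmoid range (0 to 1, default 0.5) but are hard-coded assumptions here, not values read from Joone.

```java
// Sketch of using the state bounds a sigmoid layer reports.
// Constants are assumptions mirroring the documented sigmoid
// values; this is NOT Joone code.
public class SigmoidStateSketch {
    static final double MIN_STATE = 0.0;      // cf. getMinimumState()
    static final double MAX_STATE = 1.0;      // cf. getMaximumState()
    static final double DEFAULT_STATE = 0.5;  // cf. getDefaultState()

    // Clamp an arbitrary value into the layer's valid state range.
    static double clampState(double v) {
        return Math.max(MIN_STATE, Math.min(MAX_STATE, v));
    }

    public static void main(String[] args) {
        System.out.println(clampState(1.7));  // 1.0, capped at the maximum
        System.out.println(clampState(-0.3)); // 0.0, raised to the minimum
        System.out.println(DEFAULT_STATE);    // 0.5, a sensible initial node state
    }
}
```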
Overrides: getMaximumState in class Layer

Submit Feedback to pmarrone@users.sourceforge.net