public class TanhLayer extends SimpleLayer implements LearnableLayer
Fields inherited from superclasses:
bias, gradientInps, gradientOuts, inps, inputPatternListeners, learnable, learning, m_batch, monitor, myLearner, outputPatternListeners, outs, running, step, STOP_FLAG

| Constructor and Description |
|---|
| `TanhLayer()` The default constructor. |
| `TanhLayer(java.lang.String name)` |
| Modifier and Type | Method and Description |
|---|---|
| `void` | `backward(double[] pattern)` Reverse transfer function of the component. |
| `void` | `forward(double[] pattern)` Transfer function to recall a result on a trained net. |
| `double` | `getDefaultState()` Return the default state of a node in this layer, such as 0 for a tanh layer or 0.5 for a sigmoid layer. |
| `double` | `getDerivative(int i)` Similar to the backward message; used by RTRL. |
| `double` | `getFlatSpotConstant()` Gets the flat spot constant. |
| `Learner` | `getLearner()` Deprecated. Used only for backward compatibility. |
| `double` | `getMaximumState()` Return the maximum value of a node in this layer. |
| `double` | `getMinimumState()` Return the minimum value of a node in this layer. |
| `void` | `setFlatSpotConstant(double aConstant)` Sets the constant to overcome the flat spot problem. |
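To illustrate what the transfer methods above compute, here is a minimal, self-contained sketch in plain Java. The helper names (`forward`, `derivative`) are hypothetical stand-ins rather than Joone's internals: a tanh node outputs the hyperbolic tangent of its input, its derivative can be written in terms of that output, and the default, minimum, and maximum states documented above correspond to 0, -1, and +1.

```java
public class TanhMathSketch {
    // Hypothetical stand-in for the forward transfer function of a tanh node.
    static double forward(double x) {
        return Math.tanh(x);
    }

    // Derivative of tanh expressed in terms of the output y = tanh(x):
    // d/dx tanh(x) = 1 - tanh(x)^2. Used conceptually by backward/getDerivative.
    static double derivative(double y) {
        return 1.0 - y * y;
    }

    public static void main(String[] args) {
        double y = forward(0.0);
        System.out.println(y);              // default state of a tanh node: 0
        System.out.println(derivative(y));  // derivative is largest (1.0) at 0
        System.out.println(forward(10.0));  // saturates toward the maximum state, +1
        System.out.println(forward(-10.0)); // saturates toward the minimum state, -1
    }
}
```

The derivative is computed from the layer's output rather than its input, which is why a tanh layer can backpropagate without storing the pre-activation values.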
Methods inherited from class SimpleLayer:
getLearningRate, getLrate, getMomentum, setDimensions, setLrate, setMomentum, setMonitor

Methods inherited from class Layer:
addInputSynapse, addNoise, addOutputSynapse, adjustSizeToFwdPattern, adjustSizeToRevPattern, check, checkInputEnabled, checkInputs, checkOutputs, copyInto, finalize, fireFwdGet, fireFwdPut, fireRevGet, fireRevPut, fwdRun, getAllInputs, getAllOutputs, getBias, getDimension, getLastGradientInps, getLastGradientOuts, getLastInputs, getLastOutputs, getLayerName, getMonitor, getRows, getThreadMonitor, hasStepCounter, init, initLearner, InspectableTitle, Inspections, isInputLayer, isOutputLayer, isRunning, join, randomize, randomizeBias, randomizeWeights, removeAllInputs, removeAllOutputs, removeInputSynapse, removeListener, removeOutputSynapse, resetInputListeners, revRun, run, setAllInputs, setAllOutputs, setBias, setConnDimensions, setInputDimension, setInputSynapses, setLastInputs, setLastOutputs, setLayerName, setOutputDimension, setOutputSynapses, setRows, start, stop, sumBackInput, sumInput, toString

Methods inherited from class java.lang.Object:
clone, equals, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface LearnableLayer:
getMonitor, initLearner

Methods inherited from interface NeuralLayer:
addInputSynapse, addNoise, addOutputSynapse, check, copyInto, getAllInputs, getAllOutputs, getBias, getLayerName, getMonitor, getRows, isRunning, removeAllInputs, removeAllOutputs, removeInputSynapse, removeOutputSynapse, setAllInputs, setAllOutputs, setBias, setLayerName, setMonitor, setRows, start

public TanhLayer()
The default constructor.
public TanhLayer(java.lang.String name)

public void backward(double[] pattern)
Reverse transfer function of the component.
Overrides: backward(double[]) in class SimpleLayer
Parameters: pattern - input pattern on which to apply the transfer function

public double getDerivative(int i)
Similar to the backward message; used by RTRL.
Overrides: getDerivative in class Layer

public void forward(double[] pattern)
Transfer function to recall a result on a trained net.
Overrides: forward(double[]) in class Layer
Parameters: pattern - input pattern to which to apply the transfer function

public Learner getLearner()
Deprecated. Used only for backward compatibility.
Specified by: getLearner in interface Learnable
Overrides: getLearner in class Layer
See Also: Learnable.getLearner()

public void setFlatSpotConstant(double aConstant)
Sets the constant to overcome the flat spot problem.
Parameters: aConstant

public double getFlatSpotConstant()
Gets the flat spot constant.

public double getDefaultState()
Return the default state of a node in this layer: 0 for a tanh layer.
Overrides: getDefaultState in class Layer

public double getMinimumState()
Return the minimum value of a node in this layer.
Overrides: getMinimumState in class Layer

public double getMaximumState()
Return the maximum value of a node in this layer.
Overrides: getMaximumState in class Layer

Submit Feedback to pmarrone@users.sourceforge.net
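The flat spot constant managed by `setFlatSpotConstant`/`getFlatSpotConstant` is conventionally a small value added to the activation derivative, so that weight updates do not vanish when a tanh neuron saturates near -1 or +1 (where its derivative flattens toward zero). The sketch below shows that idea in plain Java; it is an assumption about how the constant is applied, not Joone's actual implementation.

```java
public class FlatSpotSketch {
    // Assumed role of the flat spot constant: a small offset added to the
    // tanh derivative so the gradient stays nonzero on the "flat spots"
    // of the activation. The exact use inside Joone may differ.
    static double derivative(double y, double flatSpotConstant) {
        return (1.0 - y * y) + flatSpotConstant;
    }

    public static void main(String[] args) {
        double saturated = Math.tanh(8.0);              // output very close to +1
        System.out.println(derivative(saturated, 0.0)); // nearly zero: learning stalls
        System.out.println(derivative(saturated, 0.1)); // stays usefully above zero
    }
}
```

With a constant of 0 the derivative collapses in the saturated regions, which is exactly the flat spot problem the setter is documented to overcome.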