# doc-cache created by Octave 4.0.0
# name: cache
# type: cell
# rows: 3
# columns: 28
# name: <cell-element>
# type: sq_string
# elements: 1
# length: 8
dhardlim


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 44
 -- Function File: A = dhardlim (N)
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1




# name: <cell-element>
# type: sq_string
# elements: 1
# length: 10
dividerand


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 554
 -- Function File: [
          TRAINVECTORS,VALIDATIONVECTORS,TESTVECTORS,INDEXOFTRAIN,INDEXOFVALIDATION,INDEXOFTEST]
          = dividerand (ALLCASES,TRAINRATIO,VALRATIO,TESTRATIO)
     Divide the vectors into training, validation and test groups
     according to the given ratios.


          [trainVectors,validationVectors,testVectors,indexOfTrain,
              indexOfValidation,indexOfTest]
              = dividerand(allCases,trainRatio,valRatio,testRatio)

          The ratios are normalized, so:

          dividerand(xx,1,2,3) == dividerand(xx,10,20,30)
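          A minimal usage sketch (variable names are illustrative;
          output sizes depend on rounding):

          ## split 10 cases into 70%/15%/15% train/validation/test
          [trn,val,tst,iTrn,iVal,iTst] = ...
              dividerand(rand(2,10),0.70,0.15,0.15);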



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
Divide the vectors in training, validation and test group according to
the infor



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
dposlin


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 117
 -- Function File: A = dposlin (N)
     `dposlin' is the derivative of the positive linear transfer
     function `poslin' used by neural networks.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 73
`dposlin' is the derivative of the positive linear transfer function
`poslin'.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
dsatlin


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 43
 -- Function File: A = dsatlin (N)
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1




# name: <cell-element>
# type: sq_string
# elements: 1
# length: 8
dsatlins


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 105
 -- Function File: A = dsatlins (N)
     `dsatlins' is the derivative of the symmetric saturating linear
     transfer function `satlins'.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 63
`dsatlins' is the derivative of the symmetric saturating linear
transfer function `satlins'.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
hardlim


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 43
 -- Function File: A = hardlim (N)
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1




# name: <cell-element>
# type: sq_string
# elements: 1
# length: 8
hardlims


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 44
 -- Function File: A = hardlims (N)
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1




# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
ind2vec


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 254
 -- Function File: VEC = ind2vec (IND)
     `ind2vec' converts indices to a vector

          EXAMPLE 1
          vec = [1 2 3; 4 5 6; 7 8 9];

          ind = vec2ind(vec)
          The prompt output will be:
          ans =
             1 2 3 1 2 3 1 2 3



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 35
`ind2vec' converts indices to a vector



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 8
isposint


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 305
 -- Function File:  F = isposint(N)
     `isposint' returns true for positive integer values.
 
            isposint(1)   # returns TRUE
            isposint(0.5) # returns FALSE
            isposint(0)   # also returns FALSE
            isposint(-1)  # also returns FALSE
     
 


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 52
`isposint' returns true for positive integer values.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
logsig


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 226
 -- Function File: A = logsig (N)
     `logsig' is a non-linear transfer function used to train neural
     networks.  This function can be used in newff(...) to create a new
     feed-forward multi-layer neural network.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 74
`logsig' is a non-linear transfer function used to train neural
networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
mapstd


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1510
 -- Function File: [YY,PS] = mapstd (XX,YMEAN,YSTD)
     Map values to mean 0 and standard deviation 1.
 
          [YY,PS] = mapstd(XX,ymean,ystd)
          
             Applies the conversion and returns YY, mapped to mean
             YMEAN and standard deviation YSTD.
          
          [YY,PS] = mapstd(XX,FP)
          
             Applies the conversion, but uses a struct to supply the
             target mean/stddev.  This is the same as
             [YY,PS] = mapstd(XX,FP.ymean,FP.ystd).
          
          YY = mapstd('apply',XX,PS)
          
             Reapplies the conversion of a previous operation.  PS
             stores the mean and stddev of the first XX used.
          
          XX = mapstd('reverse',YY,PS)
          
             Reverses a previously applied conversion.
          
          dx_dy = mapstd('dx',XX,YY,PS)
          
             Returns the derivative of Y with respect to X.
          
          dx_dy = mapstd('dx',XX,[],PS)
          
             Returns the derivative (less efficient).
          
          name = mapstd('name');
          
             Returns the name of this conversion process.
          
          FP = mapstd('pdefaults');
          
             Returns the default process parameters.
          
          names = mapstd('pnames');
          
             Returns the description of the process parameters.
          
          mapstd('pcheck',FP);
          
             Raises an error if FP contains inconsistent parameters.
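          
          A round-trip usage sketch (variable names are illustrative):
          
          x = [1 2 3 4];
          [y,ps] = mapstd(x);           # y now has mean 0, stddev 1
          x2 = mapstd('reverse',y,ps);  # recovers the original x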
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 50
Map values to mean 0 and standard deviation 1.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
min_max


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 277
 -- Function File: PR = min_max (PP)
     `min_max' returns the matrix PR containing the range of each row
     of PP

          PR - R x 2 matrix of min and max values for R input elements

          Pp = [1 2 3; -1 -0.5 -3]
          pr = min_max(Pp);
          pr = [1 3; -3 -0.5];


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
`min_max' returns variable Pr with range of matrix rows
 
     PR - R x 2 matri



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 5
newff


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 857
 -- Function File: NET = newff (PR,SS,TRF,BTF,BLF,PF)
     `newff' creates a feed-forward backpropagation network
 
          Pr - R x 2 matrix of min and max values for R input elements
          Ss - 1 x Ni row vector with size of ith layer, for N layers
          trf - 1 x Ni list with transfer function of ith layer,
                default = "tansig"
          btf - Batch network training function,
                default = "trainlm"
          blf - Batch weight/bias learning function,
                default = "learngdm"
          pf  - Performance function,
                default = "mse".
     
          EXAMPLE 1
          Pr = [0.1 0.8; 0.1 0.75; 0.01 0.8];
               a 3 x 2 matrix, meaning three input neurons
          
          net = newff(Pr, [4 1], {"tansig","purelin"}, "trainlm", "learngdm", "mse");
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
`newff' creates a feed-forward backpropagation network
 
     Pr - R x 2 matrix 



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 4
newp


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 561
 -- Function File: NET = newp (PR,SS,TRANSFUNC,LEARNFUNC)
     `newp' creates a perceptron
 
          PLEASE DON'T USE THIS FUNCTION, IT'S STILL NOT FINISHED!
          ========================================================

          Pr - R x 2 matrix of min and max values for R input elements
          ss - a scalar value with the number of neurons
          transFunc - a string with the transfer function
                default = "hardlim"
          learnFunc - a string with the learning function
                default = "learnp"
     
 


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
`newp' creates a perceptron
 
     PLEASE DON'T USE THIS FUNCTIONS, IT'S STILL N



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
poslin


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 117
 -- Function File: A = poslin (N)
     `poslin' is a positive linear transfer function used by neural
     networks.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 73
`poslin' is a positive linear transfer function used by neural
networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
poststd


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 152
 -- Function File: [PP,TT] = poststd (PN,MEANP,STDP,TN,MEANT,STDT)
     `poststd' postprocesses the data which has been preprocessed by
     `prestd'.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 73
`poststd' postprocesses the data which has been preprocessed by
`prestd'.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
prestd


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 159
 -- Function File: [PN,MEANP,STDP,TN,MEANT,STDT] = prestd (P,T)
     `prestd' preprocesses the data so that the mean is 0 and the
     standard deviation is 1.
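
          A usage sketch (the values follow from mean([1 2 3 4]) = 2.5
          and std([1 2 3 4]) = 1.2910):

          p = [1 2 3 4];
          [pn,meanp,stdp] = prestd(p);
          ## meanp = 2.5, stdp = 1.2910, pn = (p - meanp)/stdp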


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
`prestd' preprocesses the data so that the mean is 0 and the standard
deviation 



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
purelin


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 105
 -- Function File: A = purelin (N)
     `purelin' is a linear transfer function used by neural networks.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 65
`purelin' is a linear transfer function used by neural networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
radbas


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 98
 -- Function File:  radbas (N)
     Radial basis transfer function.

     `radbas(n) = exp(-n^2)'
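
     Applied element-wise, e.g.:

          radbas([0 1 2])
          ## ans = 1.0000  0.3679  0.0183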



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 31
Radial basis transfer function.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
satlin


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 104
 -- Function File: A = satlin (N)
     `satlin' is a saturating linear transfer function used by neural
     networks.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 63
`satlin' is a saturating linear transfer function used by neural
networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
satlins


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 105
 -- Function File: A = satlins (N)
     `satlins' is a symmetric saturating linear transfer function used
     by neural networks.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 63
`satlins' is a symmetric saturating linear transfer function used by
neural networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 13
saveMLPStruct


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 118
 -- Function File:  saveMLPStruct (NET,STRFILENAME)
     `saveMLPStruct' saves a neural network structure to *.txt files


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 61
`saveMLPStruct' saves a neural network structure to *.txt files



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 3
sim


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 216
 -- Function File: NETOUTPUT = sim (NET, MINPUT)
     `sim' is used to simulate a previously defined neural network.
     NET is created with newff(...) and MINPUT should be the
     corresponding input data set.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 61
`sim' is used to simulate a previously defined neural network.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
subset


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 1664
 -- Function File:  [MTRAIN, MTEST, MVALI] = subset
          (MDATA,NTARGETS,IOPTI,FTEST,FVALI)
     `subset' splits the main data matrix, which contains inputs and
     targets, into 2 or 3 subsets depending on the parameters.

     The first parameter MDATA must be in row order.  This means if the
     network contains three inputs, the matrix must have 3 rows and x
     columns to define the data for the inputs, plus some more rows for
     the outputs (targets); e.g. a neural network with three inputs and
     two outputs must have 5 rows with x columns!  The second parameter
     NTARGETS defines the number of rows which contain the target
     values!  The third argument `iOpti' is optional and can have three
     states:
          0: no optimization
          1: will randomize the column order and order the columns
             containing min and max values to be in the train set
          2: will NOT randomize the column order, but order the columns
             containing min and max values to be in the train set
          default value is `1'
     The fourth argument `fTest' is also optional and defines what
     fraction of the data sets will be in the test set.  Default value
     is `1/3'.  The fifth parameter `fVali' is also optional and
     defines what fraction of the data sets will be in the validation
     set.  Default value is `1/6'.  So with the default values, 50% of
     all data sets remain for training.

            [mTrain, mTest, mVali] = subset(mData,1)
            returns three subsets of the complete matrix
            with randomized and optimized columns!

            [mTrain, mTest] = subset(mData,1)
            returns two subsets
     


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
`subset' splits the main data matrix which contains inputs and targets
into 2 or



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
tansig


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 226
 -- Function File: A = tansig (N)
     `tansig' is a non-linear transfer function used to train neural
     networks.  This function can be used in newff(...) to create a new
     feed-forward multi-layer neural network.


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 74
`tansig' is a non-linear transfer function used to train neural
networks.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 5
train


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 624
 -- Function File:  [NET] = train (MLPNET,MINPUTN,MOUTPUT,[],[],VV)
     A neural feed-forward network will be trained with `train'
 
          [net,tr,out,E] = train(MLPnet,mInputN,mOutput,[],[],VV);

          left side arguments:
            net: the trained network of the net structure `MLPnet'

          right side arguments:
            MLPnet : the untrained network, created with `newff'
            mInputN: normalized input matrix
            mOutput: output matrix (normalized or not)
            []     : unused parameter
            []     : unused parameter
            VV     : validation data structure


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 80
A neural feed-forward network will be trained with `train'
 
     [net,tr,out,E



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 6
trastd


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 481
 -- Function File:  PN = trastd (P,MEANP,STDP)
     `trastd' preprocesses additional data for neural network
     simulation.
 
            `p'    : test input data
            `meanp': vector with standardization parameters of prestd(...)
            `stdp' : vector with standardization parameters of prestd(...)
          
            meanp = [2.5; 6.5];
            stdp = [1.2910; 1.2910];
            p = [1 4; 2 5];
          
            pn = trastd(p,meanp,stdp);



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 66
`trastd' preprocesses additional data for neural network simulation.



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 7
vec2ind


# name: <cell-element>
# type: sq_string
# elements: 1
# length: 258
 -- Function File: IND = vec2ind (VECTOR)
     `vec2ind' converts vectors to indices

          EXAMPLE 1
          vec = [1 2 3; 4 5 6; 7 8 9];

          ind = vec2ind(vec)
          The prompt output will be:
          ans =
             1 2 3 1 2 3 1 2 3



# name: <cell-element>
# type: sq_string
# elements: 1
# length: 36
`vec2ind' converts vectors to indices





