next up previous contents
Next: Importing a new data Up: Backpropagation Previous: The RPROP training

The Levenberg-Marquardt training

 

lmbp Create a multi-layer perceptron network using the Levenberg-Marquardt training
-di <i-data> name of the input data frame for training the network
-do <o-data> name of the target data frame for training the network
-net <nlayers> <nneu1> ... <nneuN> network configuration, number of layers, number of neurons in each layer
[-types <s | t | l> ... <s | t | l>] use sigmoid, tanh, or linear neurons in each layer, default is sigmoid
[-ti <ti-data>] input data frame for a test set
[-to <to-data>] target data frame for a test set
[-vi <vi-data>] input data frame for a validation set
[-vo <vo-data>] target data frame for a validation set
-nout <wdata> data frame for saving the trained network weights
[-ef <edata>] output error to a frame
[-lamda <lamda>] regularization parameter for controlling the step size (default is 1.0)
[-mdn <mdown>] regularization parameter multiplier downwards (default is 1.1)
[-mup <mup>] regularization parameter multiplier upwards (default is 0.9)
[-one] force one neuron per input
[-penalty <penalty>] regularization coefficient

This command trains a backpropagation network using the Levenberg-Marquardt training algorithm. The method approximates the second-order derivatives (the Hessian) of the error using first-order derivatives alone, and it is quite sensitive to its regularization parameters. By default, one neuron is used for each input.
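The role of the -lamda, -mup, and -mdn parameters can be illustrated with a minimal Levenberg-Marquardt sketch in Python. This is not NDA's implementation; the function name and the toy fitting problem are illustrative, and a plain linear model stands in for the network so the update rule is easy to follow.

```python
import numpy as np

def levenberg_marquardt(residual, jac, p0, lam=1.0, mup=2.0, mdn=0.8, iters=40):
    """Sketch of Levenberg-Marquardt: the Hessian of the squared error is
    approximated by J^T J, built from first-order derivatives only."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residual(p)
        J = jac(p)
        # Damped Gauss-Newton system: lam controls the step size, blending
        # between gradient descent (large lam) and Gauss-Newton (small lam).
        A = J.T @ J + lam * np.eye(p.size)
        step = np.linalg.solve(A, J.T @ r)
        p_new = p - step
        if np.sum(residual(p_new) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam * mdn   # step accepted: decrease damping (-mdn)
        else:
            lam *= mup                  # step rejected: increase damping (-mup)
    return p

# Toy problem: fit y = a*x + b by least squares.
x = np.linspace(0.0, 1.0, 20)
y = 3.0 * x + 0.5
res = lambda p: p[0] * x + p[1] - y
jac = lambda p: np.column_stack([x, np.ones_like(x)])
p_fit = levenberg_marquardt(res, jac, [0.0, 0.0])
```

Each accepted step multiplies the damping by the downward factor, each rejected step by the upward factor, which is why the two multipliers should lie on opposite sides of 1.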

Example (ex5.12): Train a three-layer (input + hidden + output layer) MLP network on sine data, using sigmoid activation functions in the neurons. After training completes, print the training and test error graphs.

NDA> load sint.dat
NDA> load sink.dat
NDA> select sinx -f sink.ox
NDA> select siny -f sink.oy
NDA> select sinox -f sink.ox
NDA> select sintx -f sint.tx
NDA> select sinty -f sint.ty
NDA> lmbp -di sinx -do siny -net 2 3 1 -types s s s -nout wei
      -ef virhe -lamda 1.0 -mup 2.0 -mdn 0.8 -em 40
      -ti sintx -to sinty
NDA> fbp -d sinox -dout out -win wei
NDA> mkgrp xxx
NDA> ldgrv xxx -f virhe.TrainError -co black
NDA> ldgrv xxx -f virhe.TestError -co red
NDA> show xxx

[Figure: training and test error curves]



Anssi Lensu
Thu May 17 15:00:44 EET DST 2001