This command trains a backpropagation network using the Levenberg-Marquardt training algorithm. The method approximates the second-order derivatives (the Hessian) from the first-order derivatives, and is quite sensitive to its regularization parameters. By default, one neuron is used for each input.
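The core idea can be sketched numerically. The following is a minimal, hypothetical illustration (not NDA's implementation): the Hessian is approximated as JᵀJ from the first-order derivatives (the Jacobian J), and the regularization parameter is adapted after each step, multiplied by an "up" factor when a step is rejected and a "down" factor when it is accepted, mirroring the roles of the `-lamda`, `-mup`, and `-mdn` options. The helper name and the curve-fitting setup are assumptions for demonstration only.

```python
import numpy as np

def levenberg_marquardt(f, p0, x, y, lam=1.0, mup=2.0, mdn=0.8, epochs=40):
    """Minimal Levenberg-Marquardt sketch (illustrative, not NDA's code)."""
    p = np.asarray(p0, dtype=float)

    def residuals(p):
        return f(x, p) - y

    err = 0.5 * np.sum(residuals(p) ** 2)
    for _ in range(epochs):
        r = residuals(p)
        # Numerical Jacobian of the residuals w.r.t. the parameters.
        J = np.empty((len(x), len(p)))
        h = 1e-6
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = h
            J[:, j] = (residuals(p + dp) - r) / h
        # Solve (J^T J + lam * I) delta = -J^T r; lam regularizes the
        # Gauss-Newton approximation J^T J of the Hessian.
        A = J.T @ J + lam * np.eye(len(p))
        delta = np.linalg.solve(A, -J.T @ r)
        new_err = 0.5 * np.sum(residuals(p + delta) ** 2)
        if new_err < err:
            # Step accepted: keep it and relax the damping.
            p, err = p + delta, new_err
            lam *= mdn
        else:
            # Step rejected: increase the damping and retry.
            lam *= mup
    return p, err

# Fit y = a * sin(b * x) to noiseless samples of 1.5 * sin(0.8 * x).
x = np.linspace(0.0, 2 * np.pi, 50)
y = 1.5 * np.sin(0.8 * x)
p, err = levenberg_marquardt(lambda x, p: p[0] * np.sin(p[1] * x),
                             [1.0, 1.0], x, y)
```

The sensitivity mentioned above shows up directly here: a poor initial `lam`, or aggressive `mup`/`mdn` factors, can stall the damping schedule or let steps overshoot.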
Example (ex5.12): Train a three-layer network (input, hidden, and output layers) on a sine function, using a sigmoid activation function in the neurons. After training is complete, display the training and testing error graphs.
NDA> load sint.dat
NDA> load sink.dat
NDA> select sinx -f sink.ox
NDA> select siny -f sink.oy
NDA> select sinox -f sink.ox
NDA> select sintx -f sint.tx
NDA> select sinty -f sint.ty
NDA> lmbp -di sinx -do siny -full -net 2 3 1 -types s s s -nout wei -ef virhe -lamda 1.0 -mup 2.0 -mdn 0.8 -em 40 -ti sintx -to sinty
NDA> fbp -di sinox -do out -win wei
NDA> mkgrp xxx
NDA> ldgrv xxx -f virhe.TrainError -co black
NDA> ldgrv xxx -f virhe.TestError -co red
NDA> show xxx