
The basic learning rule


The test dataset (-ti, -to) is used as a stopping criterion. Training does not actually stop early; instead, after every training epoch the learning algorithm checks whether the test-set error is smaller than at any earlier point during training and, if so, stores the current network weights. When the maximum number of training epochs has been reached, these stored weights are saved to the namespace.
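
The following is a minimal sketch in plain Python (not NDA code) of the weight-saving behaviour described above: after every epoch the test-set error is checked, the weights are copied whenever that error improves, and the best copy is returned once the epoch limit is reached. The function names train_epoch and test_error and the weight object are illustrative placeholders, not part of NDA.

import copy

def train_with_best_weights(weights, train_epoch, test_error, max_epochs):
    best_error = float("inf")
    best_weights = copy.deepcopy(weights)
    for epoch in range(max_epochs):
        train_epoch(weights)              # one pass over the training set
        err = test_error(weights)         # error on the test set (-ti, -to)
        if err < best_error:              # best result so far: remember these weights
            best_error = err
            best_weights = copy.deepcopy(weights)
    return best_weights                   # the weights finally saved to the namespace
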

Example (ex5.6): Train a three-layered (input + hidden + output layer) network and save the network output.

NDA> load sin.dat
NDA> select sinx -f sin.x
NDA> select siny -f sin.y
NDA> bp -di sinx -do siny -net 3 1 10 1 -types s s s -em 2000
 -nout wei -ef virhe -bs 0.01 
NDA> fbp -di sinx -do out -win wei
NDA> select output -f sin.x out.0
NDA> save output
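
For readers unfamiliar with the bp command, the following is a conceptual sketch in plain NumPy (not NDA code) of what the example trains: a 1-10-1 network with sigmoid units fitted to (x, sin(x)) pairs by backpropagation for 2000 epochs. The data generation, learning rate, and scaling are illustrative assumptions, not NDA defaults.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2 * np.pi, size=(200, 1))   # inputs, analogous to sin.x
y = (np.sin(x) + 1.0) / 2.0                      # targets scaled into (0, 1) for the sigmoid output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# weights for the 1-10-1 architecture (-net 3 1 10 1, -types s s s)
W1 = rng.normal(scale=0.5, size=(1, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)

lr = 0.1
for epoch in range(2000):                        # -em 2000
    h = sigmoid(x @ W1 + b1)                     # hidden layer activations
    out = sigmoid(h @ W2 + b2)                   # network output
    err = out - y
    # backpropagate the error through the sigmoid derivatives
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(x); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * x.T @ d_h / len(x);   b1 -= lr * d_h.mean(axis=0)

print("final mean squared error:", float((err ** 2).mean()))
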


