
Training a TS-SOM

 

somtr -d <data> -sout <som> -l <layers> -D <dim> -t <type> -e <weighting> -c <stop_crit> -m <max_item> -L -f <corr_layer> -r <train_rule>

The SOM training creates the structure of the TS-SOM and organizes it. The result includes the network structure, stored under the given name <som>. In addition, the weight matrix of the TS-SOM is stored in a data frame named <som>"_W", as shown in the synopsis above.
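For example, the command

NDA> somtr -d predata -sout som1

stores the network structure under the name som1 and the weight matrix in the data frame som1_W (here predata refers to the preprocessed data frame of the example below).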

The command has many parameters, of which the first three are the most commonly used. If you are not sure how to use the other parameters, you will probably get the best results with their default values. The parameters are described in greater detail below:

-d <data>:
Data frame for the SOM training.
-sout <som>:
The name for the result structure and the weight matrix.
-l <layers>:
The number of layers in the TS-SOM. The default value is 3.
-D <dim>:
The dimension of the TS-SOM, i.e. the dimension of the SOM in each layer of the TS-SOM. The default value is 2.
-t <type>:
The type of the topology: 0 = lattice, 1 = ring, 2 = tree-structured vector quantizer.
-e <weighting>:
The factor for weighting the neighbors of the neurons during the training process.
-c <stop_crit>:
The stopping criterion, defined in terms of the quantization error.
-m <max_item>:
The maximum number of epochs when training one layer of the TS-SOM.
-L:
The lookup table (references from the data vectors to their BMUs) is used by default. You can bypass it by setting this flag.
-f <corr_layer>:
The number of layers for which the lookup tables will be corrected. A larger value gives better results but slows down the training.
-r <train_rule>:
The training rule: 0 = vector quantization (VQ), 1 = spreading. The first rule tries to follow the distribution of the data, while the second rule tries to spread the neurons over the data points as completely as possible.
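
For instance, a hypothetical invocation that trains a one-dimensional, five-layer TS-SOM with the spreading rule could look as follows (the data frame name predata is taken from the example below; the other values are only illustrative):

...
NDA> somtr -d predata -sout som2 -l 5 -D 1 -r 1
...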

Example (ex5.1): The training data is created by preprocessing, and the SOM is trained with it. In addition, the SOM is used for classification (see the command somcl in Sect. 5.1.2).

...
NDA> prepro -d boston -dout predata -e -n
NDA> somtr -d predata -sout som1 -l 4
NDA> somcl -d predata -s som1 -cout cld1
...

