When training any classifier, the goal is that it generalises well to unseen data. In real applications, however, overfitting is a common problem: the classifier adapts too closely to the training data and can no longer classify new samples correctly. An overfitted classifier shows very good accuracy on the training data set but fails on other data. One common cause of overfitting is a limited number of training samples. While a larger training set is always the better solution, HALCON also offers the possibility to add regularisation during the training of a multilayer perceptron (MLP) to prevent overfitting and to smooth the decision boundaries between classes.
An MLP is described by a set of weights that connect all nodes of one layer to all nodes of the next layer; during training, these weights are adapted iteratively by minimising a loss function. When examining the trained weights of an MLP that shows symptoms of overfitting, you will usually find very large values for some of the weights. Regularisation therefore adds a penalty term to the optimisation performed during training that keeps the weights small.
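As a rough sketch of the general idea (the exact formulation used internally by HALCON may differ, for example by using separate priors for different weight groups), the penalised objective has the form of a weight-decay term, where E(w) is the original training error, α the regularisation strength determined by the weight prior, and w_i the individual weights:

E_reg(w) = E(w) + (α / 2) · Σ_i w_i²

A larger α pulls the weights towards zero and thereby smooths the decision boundaries; α = 0 recovers the unregularised training.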
The effect of the penalty term is controlled by a set of parameters called the weight priors. The HALCON operator set_regularization_params_class_mlp lets you either set these parameters manually or have them estimated by the automatic mode. Regardless of how the parameters are chosen, it is good practice to always verify the performance of the MLP trained with regularisation on an independent test set that is not part of the training data, as sketched below.
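A minimal HDevelop sketch of such a workflow could look as follows. The feature dimension, network size, the tuples TrainFeatures, TrainClasses, TestFeatures, and TestClasses (feature vectors stored consecutively, two values per sample), and the chosen weight prior are assumptions for illustration only:

* Create an MLP with 2 input features, 5 hidden units, and 3 classes (assumed sizes).
create_class_mlp (2, 5, 3, 'softmax', 'normalization', 2, 42, MLPHandle)
* Enable regularisation before training (manual mode; see the next paragraph for details).
set_regularization_params_class_mlp (MLPHandle, 'weight_prior', 0.05)
* Add the training samples.
for I := 0 to |TrainClasses| - 1 by 1
    add_sample_class_mlp (MLPHandle, TrainFeatures[2 * I:2 * I + 1], TrainClasses[I])
endfor
train_class_mlp (MLPHandle, 200, 1, 0.01, Error, ErrorLog)
* Verify the result on an independent test set that was not used for training.
NumCorrect := 0
for I := 0 to |TestClasses| - 1 by 1
    classify_class_mlp (MLPHandle, TestFeatures[2 * I:2 * I + 1], 1, Class, Confidence)
    if (Class == TestClasses[I])
        NumCorrect := NumCorrect + 1
    endif
endfor
TestAccuracy := real(NumCorrect) / |TestClasses|

Apart from the call to set_regularization_params_class_mlp, this is the usual MLP classification workflow, so regularisation can be added to an existing training procedure with a single additional line.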
If the regularisation parameters are set manually, the only parameter that needs to be set with the operator set_regularization_params_class_mlp is ‘weight_prior’ itself. The larger this value is chosen, the smoother the decision boundaries will be. If the weight priors should be determined automatically, the parameter ‘num_outer_iterations’ must be set to a value >= 1 and ‘weight_prior’ must be set to provide the initial value for the estimation. For both the manual and the automatic mode, a good initial guess for the weight prior lies in the range between 0.01 and 0.1.
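The two modes could be configured as follows; the numeric values are merely illustrative starting points within the suggested range:

* Manual mode: only the weight prior itself is set.
set_regularization_params_class_mlp (MLPHandle, 'weight_prior', 0.05)
*
* Automatic mode: the initial weight prior is refined over several outer iterations.
set_regularization_params_class_mlp (MLPHandle, 'weight_prior', 0.01)
set_regularization_params_class_mlp (MLPHandle, 'num_outer_iterations', 5)
*
* After train_class_mlp, the (possibly re-estimated) weight priors can be queried.
get_regularization_params_class_mlp (MLPHandle, 'weight_prior', WeightPrior)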