Fig.: Transition of the deterioration rate with varying the number of new classes accommodated - ISOLET data set (curves for Letter 1-2, Letter 1-4, Letter 1-8, and Letter 1-16; x-axis: number of new classes accommodated).

with the other three data sets. This is perhaps due to the insufficient number of pattern vectors and thereby the weak coverage of the pattern space. Nevertheless, it can be said that, by exploiting the flexible configuration property of a PNN, the separation of the pattern space can be kept sufficiently good for each class even when new classes are added, as long as the amount of training data for each class is not excessive. As discussed above, this is supported by the empirical fact that the generalisation performance was not seriously deteriorated in almost all the cases. It can therefore be concluded that no catastrophic forgetting of the previously stored data occurred due to the accommodation of new classes, which meets Criterion 4.

Comparison Between Commonly Used Connectionist Models and PNNs/GRNNs

In practice, the advantage of PNNs/GRNNs is that they are essentially free from the baby-sitting required for, e.g., MLP-NNs or SOFMs, i.e. the necessity to tune a number of network parameters to obtain a good convergence rate, or to worry about numerical instability such as local minima or long and iterative training of the network parameters. As described earlier, by exploiting this property of PNNs/GRNNs, simple and quick incremental learning is possible due to their inherently memory-based architecture, whereby the network growing/shrinking is straightforwardly performed (Hoya and Chambers, 2001a; Hoya, 2004b).

In terms of the generalisation capability within the pattern classification context, PNNs/GRNNs normally exhibit similar capability as compared with MLP-NNs; in Hoya (1998), such a
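As an illustration of the memory-based, incrementally growing architecture described above, the following is a minimal sketch (in Python, not drawn from the cited works): each training vector is stored as a Gaussian kernel centre, so accommodating a new class simply appends kernels for that class and leaves the kernels of previously learned classes untouched. The class name SimplePNN, the fixed smoothing parameter sigma, and the toy data are assumptions made only for this example.

```python
# A minimal sketch of the memory-based property of a PNN that makes incremental
# learning straightforward: every training vector becomes a Gaussian kernel centre,
# so adding a new class only appends kernels and never retrains earlier classes.
import numpy as np


class SimplePNN:
    def __init__(self, sigma=1.0):
        self.sigma = sigma      # common smoothing (radius) parameter, assumed fixed here
        self.centres = {}       # class label -> array of stored pattern vectors

    def add_class(self, label, patterns):
        """Grow the network: store the training vectors of a (new) class as kernel centres."""
        patterns = np.atleast_2d(np.asarray(patterns, dtype=float))
        if label in self.centres:
            self.centres[label] = np.vstack([self.centres[label], patterns])
        else:
            self.centres[label] = patterns

    def remove_class(self, label):
        """Shrink the network by simply discarding the kernels of one class."""
        self.centres.pop(label, None)

    def _class_score(self, x, patterns):
        # Mean Gaussian kernel activation (a Parzen-window estimate up to a constant)
        d2 = np.sum((patterns - x) ** 2, axis=1)
        return np.mean(np.exp(-d2 / (2.0 * self.sigma ** 2)))

    def classify(self, x):
        x = np.asarray(x, dtype=float)
        scores = {lbl: self._class_score(x, pats) for lbl, pats in self.centres.items()}
        return max(scores, key=scores.get)


# Usage: classes are accommodated one at a time without retraining earlier ones.
pnn = SimplePNN(sigma=0.5)
pnn.add_class("A", [[0.0, 0.0], [0.1, 0.2]])
pnn.add_class("B", [[1.0, 1.0], [0.9, 1.1]])
print(pnn.classify([0.05, 0.1]))   # -> "A"
pnn.add_class("C", [[2.0, 2.0]])   # new class: kernels of A and B are untouched
print(pnn.classify([1.9, 2.1]))    # -> "C"
```

Note that, unlike an MLP-NN, nothing is retrained when add_class is called; in this simplified setting the only quantity that may require tuning in practice is the smoothing parameter sigma.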