"On Dynamic Soft Dimension Reduction in Evolving Fuzzy Classifiers"
, in Hüllermeier, Kruse, Hoffmann (eds.): Lecture Notes in Computer Science, Series LNAI, Vol. 6178, Springer Verlag, pp. 79-88, July 2010
This paper deals with the problem of dynamic dimension reduction during the on-line update and evolution of fuzzy classifiers. By "dynamic" we mean that the importance of features for discriminating between the classes changes over time as new data is fed into the classifiers' update mechanisms. In order to avoid discontinuities in the incremental learning process, i.e. permanently exchanging some features in the input structure of the fuzzy classifiers, we include feature weights (lying in [0,1]) in the training and update of the fuzzy classifiers; these weights measure the importance levels of the various features and can be updated smoothly with new incoming samples. When a weight becomes (approximately) 0, the corresponding feature is automatically switched off, achieving a (soft) dimension reduction. The approaches for incrementally updating the feature weights are based on a leave-one-feature-out criterion and on a feature-wise separability criterion. We describe the concept for integrating the feature weights into evolving fuzzy classifiers using single-model and multi-model architectures. The whole approach is evaluated on high-dimensional on-line real-world classification scenarios.
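To illustrate the general idea behind the feature-wise separability criterion, the following is a minimal sketch (not the paper's exact formulas): per-class statistics are maintained incrementally with a forgetting factor, a Fisher-like between-class vs. within-class scatter ratio is computed per feature, and the ratios are normalized into feature weights in [0,1], so that near-zero weights effectively switch a feature off. The class name, the decay parameter, and the normalization are illustrative assumptions.

```python
import numpy as np

class SoftFeatureWeights:
    """Illustrative sketch of incremental, feature-wise soft feature
    weighting (assumed formulation, not the paper's exact criterion)."""

    def __init__(self, n_features, n_classes, decay=0.99, eps=1e-12):
        self.decay = decay  # forgetting factor for smooth on-line updates
        self.eps = eps
        # running per-class statistics: counts, sums, sums of squares
        self.count = np.zeros(n_classes)
        self.sum = np.zeros((n_classes, n_features))
        self.sumsq = np.zeros((n_classes, n_features))

    def update(self, x, y):
        # exponentially forget old statistics, then absorb the new sample
        self.count *= self.decay
        self.sum *= self.decay
        self.sumsq *= self.decay
        self.count[y] += 1.0
        self.sum[y] += x
        self.sumsq[y] += x * x

    def weights(self):
        # feature-wise separability: between-class vs. within-class scatter
        active = self.count > self.eps          # classes seen so far
        n = self.count[active][:, None]
        mean = self.sum[active] / n
        var = np.maximum(self.sumsq[active] / n - mean ** 2, 0.0)
        overall = (n * mean).sum(axis=0) / n.sum()
        between = (n * (mean - overall) ** 2).sum(axis=0)
        within = (n * var).sum(axis=0) + self.eps
        sep = between / within
        # normalize to [0,1]; near-zero weights switch a feature (softly) off
        return sep / (sep.max() + self.eps)
```

A feature that discriminates well between the classes (large between-class, small within-class scatter) receives a weight near 1, while an uninformative feature drifts toward 0; because the statistics are smoothed with the decay factor, the weights evolve continuously rather than abruptly exchanging features.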