Browsing by Author "Luukka, Pasi"
Now showing 1 - 7 of 7
Item: Differential Evolution Based Nearest Prototype Classifier with Optimized Distance Measures and GOWA (Springer, 2015)
Authors: Koloseni, David; Luukka, Pasi
Abstract: A nearest prototype classifier based on the differential evolution algorithm, a pool of distances and generalized ordered weighted averaging (GOWA) is introduced. The classifier is based on forming optimal ideal solutions for each class. In addition, the distance measures are optimized for each feature in the data sets to improve the recognition of which class a sample belongs to. This leads to a distance vector, which is then aggregated into a single distance by using GOWA. In earlier work a simple sum was applied in the aggregation process. The classifier is empirically tested with seven data sets. The proposed classifier provided at least comparable accuracy to, or outperformed, the compared classifiers, including the earlier versions of the DE classifier and the DE classifier with a pool of distances.
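The GOWA aggregation of feature-wise distances described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-feature absolute difference, the weight vector and the exponent lam are placeholder assumptions, whereas in the paper the per-feature distance measures and the aggregation parameters are optimized by differential evolution.

```python
import numpy as np

def gowa(distances, weights, lam=2.0):
    """Generalized OWA: reorder the arguments in descending order and take
    the weighted power mean with exponent lam."""
    b = np.sort(np.asarray(distances, dtype=float))[::-1]   # descending reorder
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                          # normalized weights
    return float(np.sum(w * b ** lam) ** (1.0 / lam))

def classify(sample, prototypes, weights, lam=2.0):
    """Assign the sample to the class whose prototype ("ideal solution")
    gives the smallest GOWA-aggregated feature-wise distance."""
    scores = [gowa(np.abs(sample - proto), weights, lam) for proto in prototypes]
    return int(np.argmin(scores))

# toy usage: two classes, three features
prototypes = np.array([[0.1, 0.2, 0.3], [0.8, 0.7, 0.9]])
weights = np.array([0.5, 0.3, 0.2])
print(classify(np.array([0.15, 0.25, 0.35]), prototypes, weights))
```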
Item: Differential Evolution Based Nearest Prototype Classifier with Optimized Distance Measures for the Features in the Data Sets (Elsevier, 2013)
Authors: Koloseni, David; Lampinen, Jouni; Luukka, Pasi
Abstract: In this paper a further generalization of the differential evolution based data classification method is proposed, demonstrated and initially evaluated. The differential evolution classifier is a nearest prototype vector based classifier that applies a global optimization algorithm, differential evolution, to determine the optimal values of all free parameters of the classifier model during the training phase. The earlier version of the classifier, which applied an individually optimized distance measure to each new data set to be classified, is generalized here so that, instead of optimizing a single distance measure for the given data set, distance measures are optimized individually for each feature of the data set to be classified. In particular, the distance measure for each feature is selected optimally from a predefined pool of alternative distance measures. The optimal distance measures are determined by the differential evolution algorithm, which in parallel also determines the optimal values of all free parameters of the selected distance measures. After the optimal distance measure and its optimal parameters have been determined for each feature, all featurewise distance measures are combined into a single total distance measure that is applied for the final classification decisions. The actual classification process is still based on the nearest prototype vector principle: a sample belongs to the class represented by the nearest prototype vector when measured with the optimized total distance measure. During the training process the differential evolution algorithm determines the optimal class prototype vectors, selects the optimal distance measure for each data feature, and determines the optimal values of the free parameters of each selected distance measure. Based on experimental results with nine well known classification benchmark data sets, the proposed approach yields a statistically significant improvement in the classification accuracy of the differential evolution classifier.

Item: Differential Evolution Classifier with Optimized Distance Measures for the Features in the Data Sets (Springer, 2013)
Authors: Koloseni, David; Lampinen, Jouni; Luukka, Pasi
Abstract: In this paper we propose a further generalization of the differential evolution based data classification method. The current work extends our earlier differential evolution based nearest prototype classifier, which includes optimization of the applied distance measure for the particular data set at hand. Here the approach is generalized further so that, instead of optimizing only a single distance measure for the given data set, multiple distance measures are optimized individually for each feature in the data set. Thereby, instead of applying a single distance measure to all data features, we determine the optimal distance measure individually for each feature. After the optimal class prototype vectors and the optimal distance measures for each feature, together with the optimal parameters of each distance measure, have been determined, the individually measured featurewise distances are combined in the actual classification phase into an overall distance between the class prototype vectors and the sample. Each sample is then classified to the class of the nearest prototype vector according to that overall distance measure. The proposed approach is demonstrated and initially evaluated with three different data sets.
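The featurewise combination of optimized distance measures described in the two items above can be sketched roughly as follows. The pool contents, the plain-sum combination and the toy values are illustrative assumptions; in the papers the measure selections, their parameters and the class prototype vectors are all produced by differential evolution during the training phase.

```python
import numpy as np

# A small illustrative pool of per-feature distance measures; the papers'
# actual pool and parameterizations are not reproduced here.
POOL = [
    lambda x, y, p: abs(x - y),           # absolute difference
    lambda x, y, p: (x - y) ** 2,         # squared difference
    lambda x, y, p: abs(x - y) ** p,      # power difference with exponent p
]

def total_distance(sample, prototype, measure_idx, params):
    """Combine the feature-wise distances given by the selected measures
    into a single total distance (a plain sum is used here for illustration)."""
    return sum(POOL[m](x, y, p)
               for x, y, m, p in zip(sample, prototype, measure_idx, params))

def nearest_prototype(sample, prototypes, measure_idx, params):
    """Nearest prototype decision rule under the combined total distance."""
    dists = [total_distance(sample, proto, measure_idx, params) for proto in prototypes]
    return int(np.argmin(dists))

# toy usage: in the papers these selections, parameters and prototypes
# all come from the differential evolution training phase
prototypes = np.array([[0.2, 0.1, 0.4], [0.9, 0.8, 0.6]])
measure_idx = [0, 2, 1]      # distance measure chosen per feature
params = [1.0, 3.0, 1.0]     # free parameter per selected measure (unused by some)
print(nearest_prototype(np.array([0.25, 0.2, 0.5]), prototypes, measure_idx, params))
```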
Item: Differential Evolution Classifier with Optimized Distance Measures from a Pool of Distances (2012)
Authors: Koloseni, David; Lampinen, Jouni; Luukka, Pasi
Abstract: In this article we propose a differential evolution based nearest prototype classifier extended with the selection of the applied distance measure, optimally for the particular data set at hand, from a pool of alternative measures. The proposed method extends the earlier differential evolution based nearest prototype classifier by extending the optimization process to cover also the selection of the distance measure, instead of optimizing only the parameters of a preselected and fixed distance measure. The optimization process now also searches for the distance measure that provides the highest classification accuracy over the selected data set. It has been clear for some time that in classification the usual Euclidean distance measure is sometimes not the best possible choice. Still, usually little has been done about this, and in the cases where the problem has received some consideration, typically only a couple of alternative distance measures have been tested to find which one provides the highest classification accuracy over the current data set. In this paper we attempt to take one step further by not only enumerating a couple of alternative distance measures, but applying a systematic optimization process to select the best distance measure from a pool of multiple alternatives. In parallel, within the same optimization process, the optimal parameter values of each alternative distance measure are determined, as well as the optimal class prototype vectors for the given data. The empirical results presented indicate that with several data sets the optimal distance measure is some measure other than the most commonly applied Euclidean distance. The results also suggest that, from the classification accuracy point of view, the proposed global optimization approach has high potential in solving classification problems of the studied type. Perhaps the most generally applicable conclusion from our results is that the selection of the distance measure is more important to classification accuracy than has been commonly believed so far.

Item: Differential Evolution Classifier with Optimized OWA-Based Multi-distance Measures for the Features in the Data Sets (Springer, 2015)
Authors: Koloseni, David; Fedrizzi, Mario; Luukka, Pasi; Lampinen, Jouni; Collan, Mikael
Abstract: This paper introduces a new classification method that uses the differential evolution algorithm to select, feature-wise and from a pool of distance measures, an optimal distance measure to be used for the classification of elements. The distances yielded for each feature by the optimized distance measures are aggregated into an overall distance for each element by using OWA based multi-distance aggregation.

Item: Feature Selection using Yu's Similarity Measure and Fuzzy Entropy Measures (IEEE, 2012)
Authors: Iyakaremye, Cesar; Luukka, Pasi; Koloseni, David
Abstract: In classification problems feature selection plays an important role for several reasons. It can reduce computational cost by simplifying the model. Also, when the model is taken into practical use, fewer inputs are needed, which in practice means that fewer measurements from new samples are required. Removing insignificant features from the data set makes the model more transparent and more comprehensible, so it can provide a better explanation for a medical diagnosis, which is an important requirement in medical applications. The feature selection process can also reduce noise and thereby enhance the classification accuracy. In this article a feature selection method based on Yu's similarity measure and fuzzy entropy measures is introduced and tested together with the similarity classifier. The model was tested with the dermatology data set, and the results compare quite well with previous work: the mean classification accuracy was 98.83%, achieved using 33 of the 34 original features.
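The entropy-based ranking step in the feature selection item above can be illustrated with the following sketch. It assumes the De Luca-Termini fuzzy entropy and takes the similarity values as given; Yu's similarity measure itself, which the paper uses to produce those values, is not reproduced here, so treat this as a rough approximation of the idea rather than the paper's procedure.

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    """De Luca-Termini fuzzy entropy of similarity/membership values in [0, 1]."""
    mu = np.clip(np.asarray(mu, dtype=float), eps, 1.0 - eps)
    return float(-np.sum(mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu)))

def rank_features(similarities):
    """similarities: (n_samples, n_features) array of the similarity of each
    sample to its class ideal vector, computed feature by feature.
    Returns feature indices ordered from lowest to highest entropy; the
    highest-entropy features are the least informative removal candidates."""
    entropies = np.array([fuzzy_entropy(similarities[:, j])
                          for j in range(similarities.shape[1])])
    return np.argsort(entropies)

# toy usage with random similarity values
rng = np.random.default_rng(0)
sims = rng.uniform(size=(20, 5))
print(rank_features(sims))
```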
Item: Optimized Distance Metrics for Differential Evolution Based Nearest Prototype Classifier (Elsevier, 2012)
Authors: Koloseni, David; Lampinen, Jouni; Luukka, Pasi
Abstract: In this article we introduce a differential evolution based classifier extended with the automatic selection of the applied distance measure from a predefined pool of alternative distance measures, so that the measure suits the particular data set at hand optimally. The proposed method extends the earlier differential evolution based nearest prototype classifier by optimizing not only the required parameters of the distance measures, but also the selection of the distance measure itself, in order to find the best possible distance measure for the particular data set at hand. It has been clear for some time that in classification the usual Euclidean distance is often not the best choice, and that the optimal distance measure depends on the particular properties of the data set to be classified. So far, solving this issue has received only limited attention in the literature. In the cases where the problem has been given some consideration, typically only a couple of distance measures have been tested to find which one suits the data at hand best. In this paper we attempt to take one step further by applying a systematic global optimization approach for selecting, from a set of alternative measures, the distance measure that yields the highest classification accuracy for the given data. In particular, we have generated a pool of distance measures for this purpose and developed a model of how the differential evolution based classifier can be extended to optimize the selection of the distance measure for the given data. The obtained results demonstrate, and further confirm earlier findings reported in the literature, that often some distance measure other than the most commonly used Euclidean distance is the best choice. The selection of the distance measure is one of the most important factors for obtaining the best classification accuracy and should thereby be emphasized more in future research. The results also indicate that it is possible to build a classifier that selects the optimal distance measure for the given data automatically, and that the proposed extension of the differential evolution based classifier is an efficient alternative for solving classification problems.
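As a rough, self-contained sketch of the idea running through the pool-of-distances items above, the snippet below lets a general-purpose differential evolution routine choose a distance measure from a small pool and tune the class prototype vectors at the same time. It uses SciPy's differential_evolution rather than the authors' own implementation, and the pool, the parameter bounds and the toy data are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative pool of whole-vector distance measures with one free parameter p;
# the actual pool used in the papers is not reproduced here.
POOL = [
    lambda a, b, p: np.sum(np.abs(a - b)),                     # Manhattan
    lambda a, b, p: np.sqrt(np.sum((a - b) ** 2)),             # Euclidean
    lambda a, b, p: np.sum(np.abs(a - b) ** p) ** (1.0 / p),   # Minkowski with exponent p
]

def decode(z, n_classes, n_features):
    """Split a DE candidate vector into measure index, parameter and prototypes."""
    m = int(np.clip(np.floor(z[0]), 0, len(POOL) - 1))  # which distance measure to use
    p = z[1]                                            # free parameter of that measure
    protos = z[2:].reshape(n_classes, n_features)       # class prototype vectors
    return m, p, protos

def error_rate(z, X, y, n_classes):
    """Fitness for DE: misclassification rate of the nearest prototype rule."""
    m, p, protos = decode(z, n_classes, X.shape[1])
    preds = [np.argmin([POOL[m](x, c, p) for c in protos]) for x in X]
    return float(np.mean(np.asarray(preds) != y))

# toy usage: two Gaussian classes in three dimensions
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (30, 3)), rng.normal(2.0, 1.0, (30, 3))])
y = np.array([0] * 30 + [1] * 30)
bounds = [(0, len(POOL)), (1.0, 5.0)] + [(-3.0, 5.0)] * (2 * 3)
result = differential_evolution(error_rate, bounds, args=(X, y, 2), seed=1, maxiter=50)
print("best training error:", result.fun)
```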