Abstract
In a preceding contribution, we studied a fuzzy multiclassifier system (MCS) design framework based on the Fuzzy Unordered Rule Induction Algorithm (FURIA), which served as the fuzzy rule classification learning algorithm for deriving the component classifiers via bagging and feature selection. In this work, we integrate that approach into the overproduce-and-choose strategy. A state-of-the-art evolutionary multiobjective algorithm, NSGA-II, is used to perform component classifier selection and thereby improve the FURIA-based fuzzy MCS. We propose five fitness functions built from three optimization criteria: accuracy, complexity, and diversity. Experiments were conducted on twenty high-dimensional UCI datasets. The combination of the accuracy and diversity criteria provided very promising results, proving competitive with classical MCS learning methods.
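The paper's exact fitness functions are not reproduced on this page. As an illustration only, a minimal sketch (hypothetical function names, an oracle matrix of per-sample correctness) of two of the objective criteria the abstract mentions, ensemble accuracy under majority voting and pairwise disagreement as a diversity measure, might look as follows:

```python
import itertools

def ensemble_accuracy(oracle, subset):
    """Majority-vote accuracy of a selected classifier subset.

    oracle[i][j] == 1 if classifier i labels sample j correctly, else 0.
    A sample counts as correct when a strict majority of the selected
    classifiers labels it correctly.
    """
    n_samples = len(oracle[0])
    correct = 0
    for j in range(n_samples):
        votes = sum(oracle[i][j] for i in subset)
        if votes * 2 > len(subset):  # strict majority of the subset is right
            correct += 1
    return correct / n_samples

def pairwise_disagreement(oracle, subset):
    """Mean pairwise disagreement over the subset (higher = more diverse)."""
    pairs = list(itertools.combinations(subset, 2))
    if not pairs:
        return 0.0
    n_samples = len(oracle[0])
    total = 0.0
    for i, k in pairs:
        # Fraction of samples on which the two classifiers disagree.
        total += sum(oracle[i][j] != oracle[k][j] for j in range(n_samples)) / n_samples
    return total / len(pairs)
```

In an overproduce-and-choose setting, NSGA-II would evolve binary subset encodings and rank them by Pareto dominance over such objectives; the code above only sketches the objective evaluations themselves, not the paper's specific formulations.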
Rights and permissions
This is an open access article distributed under the CC BY-NC license (https://creativecommons.org/licenses/by-nc/4.0/).
About this article
Cite this article
Trawiński, K., Cordón, O. & Quirin, A. A Study on the Use of Multiobjective Genetic Algorithms for Classifier Selection in FURIA-based Fuzzy Multiclassifiers. Int J Comput Intell Syst 5, 231–253 (2012). https://doi.org/10.1080/18756891.2012.685272

