Learning a Nonlinear Combination of Generalized Heterogeneous Classifiers

Publish Year: 1402 (Solar Hijri; 2023 CE)
Document type: Journal article
Language: English

The full paper is 18 pages long and is available for download in PDF format.




National scientific document ID: JR_JADM-11-1_007

Indexing date: 20 Farvardin 1402 (9 April 2023)

Abstract:

Finding an effective way to combine the base learners is an essential part of constructing a heterogeneous ensemble of classifiers. In this paper, we propose a framework for heterogeneous ensembles that uses an artificial neural network to learn a nonlinear combination of the base classifiers. In the proposed framework, a set of heterogeneous classifiers is stacked to produce the first-level outputs. These outputs are then augmented with several combination functions to construct the inputs of the second-level classifier. We conduct extensive experiments on 121 datasets and compare the proposed method with established and state-of-the-art heterogeneous methods. The results demonstrate that the proposed scheme outperforms many heterogeneous ensembles and is superior to singly tuned classifiers. The proposed method also performs notably better than several homogeneous ensembles. Our findings suggest that the improvements are even more significant on larger datasets.
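The scheme described above can be sketched in a few lines of scikit-learn: heterogeneous base classifiers produce first-level probability outputs, those outputs are augmented with simple combination functions (mean, max, product are illustrative assumptions here, not necessarily the authors' exact functions), and a small neural network serves as the second-level classifier. The choice of base learners, dataset, and network size below is hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# heterogeneous first-level classifiers (illustrative choices)
bases = [DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()]

def augment(probs):
    """Concatenate per-classifier class probabilities with mean/max/product combiners."""
    cube = np.stack(probs)                       # shape: (n_clf, n_samples, n_classes)
    return np.hstack(list(probs) + [cube.mean(0), cube.max(0), cube.prod(0)])

# out-of-fold first-level outputs, so the meta-learner never sees leaked predictions
oof = [cross_val_predict(clf, X_tr, y_tr, cv=5, method="predict_proba") for clf in bases]
Z_tr = augment(oof)                              # 3 bases x 2 classes + 3 combiners x 2 classes = 12 columns

for clf in bases:                                # refit base classifiers on the full training split
    clf.fit(X_tr, y_tr)

# second-level classifier: a small neural network learning the nonlinear combination
meta = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
meta.fit(Z_tr, y_tr)

Z_te = augment([clf.predict_proba(X_te) for clf in bases])
pred = meta.predict(Z_te)
acc = (pred == y_te).mean()
```

Using out-of-fold predictions for the meta-learner's training inputs is the standard stacked-generalization precaution; without it, the second-level classifier would be trained on overfit first-level outputs.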

Authors

M. Rahimi

Faculty of Computer Engineering, Shahrood University of Technology, Shahrood, Iran.

A. A. Taheri

Faculty of Computer Engineering, Shahrood University of Technology, Shahrood, Iran.

H. Mashayekhi

Faculty of Computer Engineering, Shahrood University of Technology, Shahrood, Iran.
