Comparing different stopping criteria for fuzzy decision tree induction through IDFID3

Publish Year: 1393 (Solar Hijri)
Document type: journal article
Language: English

This paper is 22 pages long and is available for download in PDF format.





National scientific document ID: JR_IJFS-11-1_003

Indexing date: 31 Khordad 1401 (21 June 2022)

Abstract:

Fuzzy Decision Tree (FDT) classifiers combine decision trees with the approximate reasoning offered by fuzzy representation to deal with language and measurement uncertainties. When an FDT induction algorithm utilizes stopping criteria for early stopping of the tree's growth, the threshold values of the stopping criteria control the number of nodes. Finding a proper threshold value for a stopping criterion is one of the greatest challenges in FDT induction. In this paper, we propose a new method named Iterative Deepening Fuzzy ID3 (IDFID3) for FDT induction that can control the tree's growth by dynamically setting the threshold value of the stopping criterion in an iterative procedure. The final FDT induced by IDFID3 and the one obtained by common FID3 are the same when the numbers of nodes of the induced FDTs are equal, but our main intention in introducing IDFID3 is the comparison of different stopping criteria through this algorithm. Therefore, a new stopping criterion named Normalized Maximum fuzzy information Gain multiplied by Number of Instances (NMGNI) is proposed, and IDFID3 is used to compare it against the other stopping criteria. Generally speaking, this paper presents a method for comparing different stopping criteria independently of their threshold values by utilizing IDFID3. The comparison results show that FDTs induced by the proposed stopping criterion are in most situations superior to the others, and that the number-of-instances stopping criterion performs better than the fuzzy information gain stopping criterion in terms of complexity (i.e., number of nodes) and classification accuracy. Also, both the tree depth and fuzzy information gain stopping criteria outperform fuzzy entropy, accuracy, and number of instances in terms of the mean depth of the generated FDTs.
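The iterative-deepening idea described in the abstract can be illustrated with a minimal sketch. Everything below is hypothetical: the names `induce_fid3` and `idfid3`, the toy node-count model, and the threshold schedule are assumptions for illustration only; the paper's actual induction procedure and criteria are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Tree:
    node_count: int
    threshold: float

def induce_fid3(data, criterion, threshold):
    # Stand-in for an FID3 inducer: in this toy model, a looser
    # (smaller) stopping threshold lets the tree grow more nodes.
    return Tree(node_count=int(10 / threshold), threshold=threshold)

def idfid3(data, criterion, target_nodes, thresholds):
    # Iterative deepening: re-induce the tree with progressively
    # looser thresholds until it reaches the target node count.
    # This lets different stopping criteria be compared at a fixed
    # tree complexity, independently of their threshold scales.
    tree = None
    for t in sorted(thresholds, reverse=True):  # strictest first
        tree = induce_fid3(data, criterion, t)
        if tree.node_count >= target_nodes:
            break
    return tree
```

For example, `idfid3(None, "fuzzy_gain", 25, [1.0, 0.5, 0.25])` keeps loosening the threshold until the induced tree has at least 25 nodes, which is how two criteria can be compared on trees of equal size.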

Authors

Mohsen Zeinalkhani

Department of Computer Engineering, Shahid Bahonar University of Kerman, Kerman, Iran

Mahdi Eftekhari

Department of Computer Engineering, Shahid Bahonar University of Kerman, Kerman, Iran
