Minimum Perturbation Classification: A Statistical Approach to Class Assignment Based on Distributional Stability

17 pages, available in PDF format.




Abstract:

We introduce the Minimum Perturbation Classifier (MPC), a novel classification method based on the principle that a sample belongs to the class whose statistical distribution is least perturbed by its addition. Unlike traditional distance-based or probabilistic classifiers, MPC evaluates membership by measuring changes in distributional statistics when a candidate sample is hypothetically added to each class. The classifier assigns the sample to the class exhibiting minimal statistical perturbation, operating under the intuition that true members should naturally integrate into their parent distribution without causing significant disruption. We formalize this concept through a perturbation metric that aggregates changes across multiple statistical measures including mean, variance, standard deviation, skewness, and kurtosis. Experimental evaluation on four benchmark datasets (Iris, Wine, Breast Cancer, and Digits) demonstrates that MPC achieves competitive performance with established methods including k-Nearest Neighbors, Support Vector Machines, and Naive Bayes on low-dimensional problems, with accuracy reaching 91.11% on Iris and 98.15% on Wine datasets. Our analysis reveals that MPC performs optimally when combined with variance-based statistics and sample-size normalization, particularly excelling in scenarios with 4-15 features and well-separated class distributions. However, performance degrades in high-dimensional spaces (>30 features), suggesting that the approach is most suitable for problems where distributional characteristics are the primary discriminating factors. This work contributes both a new theoretical perspective on classification through distributional stability and practical insights into the conditions under which this perspective yields competitive results.
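The decision rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact aggregation of the perturbation metric and the sample-size normalization mentioned in the abstract are not specified here, so this sketch simply sums the absolute per-feature changes in mean, variance, standard deviation, skewness, and kurtosis when the candidate is hypothetically added to each class.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def perturbation(X_class, x):
    """Aggregate absolute change in per-feature statistics when x is
    hypothetically added to the class sample X_class (shape: n x d)."""
    X_aug = np.vstack([X_class, x])
    total = 0.0
    for stat in (np.mean, np.var, np.std, skew, kurtosis):
        before = np.asarray(stat(X_class, axis=0), dtype=float)
        after = np.asarray(stat(X_aug, axis=0), dtype=float)
        total += np.abs(after - before).sum()
    return total

def mpc_predict(X, y, x):
    """Assign x to the class whose empirical distribution it perturbs least."""
    classes = np.unique(y)
    scores = [perturbation(X[y == c], x) for c in classes]
    return classes[int(np.argmin(scores))]

# Toy example with two well-separated 2-D classes.
X0 = np.array([[0, 0], [1, 1], [0, 1], [1, 0], [0.5, 0.5]])
X1 = X0 + 10.0
X = np.vstack([X0, X1])
y = np.array([0] * 5 + [1] * 5)
print(mpc_predict(X, y, np.array([0.2, 0.3])))    # near class 0
print(mpc_predict(X, y, np.array([10.2, 10.3])))  # near class 1
```

A candidate near a class centroid barely shifts that class's moments but drags the other class's mean and variance sharply, so the argmin recovers the intuitive assignment on separated clusters.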

