[1] Barua, S.; Islam, M. M.; Yao, X.; Murase, K. (2014). "MWMOTE—Majority Weighted Minority Oversampling Technique for Imbalanced Data Set Learning". IEEE Transactions on Knowledge and Data Engineering, 26, pp. 405–425.
[2] Breiman, L.; Friedman, J.; Olshen, R.; Stone, C. (1984). "Classification and Regression Trees". Wadsworth, Belmont, CA. ISBN 0-534-98053-8.
[3] Bunkhumpornpat, C.; Sinapiromsaran, K.; Lursinsap, C. (2009). "Safe-Level-SMOTE: Safe-Level-Synthetic Minority Over-Sampling Technique for Handling the Class Imbalanced Problem". PAKDD, LNCS, 5476, pp. 475–482. Springer, Heidelberg.
[4] Bunkhumpornpat, C.; Sinapiromsaran, K.; Lursinsap, C. (2012). "DBSMOTE: Density-Based Synthetic Minority Over-sampling Technique". Applied Intelligence, 36, pp. 664–684.
[5] Chawla, N. V.; Bowyer, K. W.; Hall, L. O.; Kegelmeyer, W. P. (2002). "SMOTE: Synthetic Minority Over-sampling Technique". Journal of Artificial Intelligence Research, 16, pp. 321–357.
[6] Chawla, N. V.; Lazarevic, A.; Hall, L. O.; Bowyer, K. (2003). "SMOTEBoost: Improving Prediction of the Minority Class in Boosting". Knowledge Discovery in Databases, pp. 107–119.
[7] Chawla, N. V.; Japkowicz, N.; Kolcz, A. (2004). "Editorial: Special Issue on Learning from Imbalanced Data Sets". SIGKDD Explorations, 6(1), pp. 1–6.
[8] Fan, W.; Stolfo, S. J.; Zhang, J.; Chan, P. K. (1999). "AdaCost: Misclassification Cost-sensitive Boosting". ICML, 99, pp. 97–105.
[9] Freund, Y.; Schapire, R. E. (1996). "Experiments with a New Boosting Algorithm". Machine Learning: Proceedings of the Thirteenth International Conference, pp. 148–156.
[10] Han, H.; Wang, W. Y.; Mao, B. H. (2005). "Borderline-SMOTE: A New Over-Sampling Method in Imbalanced Data Sets Learning". ICIC, LNCS, 3644, pp. 878–887. Springer, Heidelberg.
[11] He, H.; Bai, Y.; Garcia, E. A.; Li, S. (2008). "ADASYN: Adaptive Synthetic Sampling Approach for Imbalanced Learning". IEEE World Congress on Computational Intelligence, pp. 1322–1328.
[12] Mustafa, G.; Niu, Z.; Yousif, A.; Tarus, J. (2015). "Solving the Class Imbalance Problems using RUSMultiBoost Ensemble". 2015 10th Iberian Conference on Information Systems and Technologies (CISTI), pp. 1–6.
[13] Quinlan, J. R. (1993). "C4.5: Programs for Machine Learning". Machine Learning, 16, pp. 235–240.
[14] Sáez, J. A.; Luengo, J.; Stefanowski, J.; Herrera, F. (2015). "SMOTE–IPF: Addressing the noisy and borderline examples problem in imbalanced classification by a re-sampling method with filtering". Information Sciences, 291, pp. 184–203.
[15] Seiffert, C.; Khoshgoftaar, T. M.; Van Hulse, J.; Napolitano, A. (2008). "Building Useful Models from Imbalanced Data with Sampling and Boosting". Association for the Advancement of Artificial Intelligence, pp. 306–311.
[16] Sun, Z.; Song, Q.; Zhu, X.; Sun, H.; Xu, B.; Zhou, Y. (2015). "A novel ensemble method for classifying imbalanced data". Pattern Recognition, 48, pp. 1623–1637.
[17] Weiss, G. M. (2004). "Mining with Rarity: A Unifying Framework". SIGKDD Explorations, 6, pp. 7–19.
[18] Weiss, G. M.; McCarthy, K.; Zabar, B. (2007). "Cost-Sensitive Learning vs. Sampling: Which is Best for Handling Unbalanced Classes with Unequal Error Costs?". DMIN, pp. 35–41.
[19] Yin, Q. Y.; Zhang, J. S.; Zhang, C. X.; Liu, S. C. (2013). "An Empirical Study on the Performance of Cost-Sensitive Boosting Algorithms with Different Levels of Class Imbalance". Mathematical Problems in Engineering, Article ID 761814.