1. Bauer, E., & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36(1-2), 105-139.
2. Breiman, L. (1996). Bagging predictors. Machine Learning, 24(2), 123-140.
3. Frank, E., Wang, Y., Inglis, S., Holmes, G., & Witten, I. H. (1998). Using model trees for classification. Machine Learning, 32(1), 63-76.
4. Frank, E., & Witten, I. H. (1998). Generating accurate rule sets without global optimization. Retrieved from https://hdl.handle.net/10289/1047
5. Freund, Y., & Schapire, R. E. (1996). Experiments with a new boosting algorithm. Paper presented at the ICML.
6. Han, J., Kamber, M., & Tung, A. K. (2001). Spatial clustering methods in data mining. Geographic Data Mining and Knowledge Discovery, 188-217.
7. Hosmer Jr, D. W., Lemeshow, S., & Sturdivant, R. X. (2013). Applied logistic regression (Vol. 398). John Wiley & Sons.
8. Kohavi, R., & Kunz, C. (1997). Option decision trees with majority votes. Paper presented at the ICML.
9. Maclin, R., & Opitz, D. (1997). An empirical evaluation of bagging and boosting. AAAI/IAAI, 1997, 546-551.
10. Platt, J. C. (1999). Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods, 185-208.
11. Quinlan, J. R. (1992). Learning with continuous classes. Paper presented at the 5th Australian Joint Conference on Artificial Intelligence.
12. Quinlan, J. R. (1993). C4.5: Programs for machine learning. Elsevier.
13. 張復喻. (2014). Data preprocessing: A study on integrating imputation methods and instance selection (in Chinese). National Central University. Available from Airiti Library database.