[1]羅中育, “Applying Taguchi Quality Engineering to the Parameter Combinations of Simulated Annealing: The Traveling Salesman Problem (TSP) as an Example,” Master's Thesis, Institute of Industrial Engineering and Management, National Yunlin University of Science and Technology, 2001.
[2]胡崇銘, “Evaluating Fund Performance and Risk by Principal Component Analysis,” Ph.D. Dissertation, Graduate Institute of Business Administration, National Taiwan University, 2000.
[3]王亞倫, “A Study on Diagnosing Multivariate Control Charts,” Master's Thesis, Institute of Industrial Management, National Central University, 2000.
[4]陳孝先, “Mining Customer Contribution and Loyalty Using Principal Component Score Distribution Functions,” Master's Thesis, Department of Computer Science and Information Engineering, Tunghai University, 2005.
[5]周文賢, Multivariate Statistical Analysis: Using SAS/STAT, 智勝書局, Taipei, 2002.
[6]吳智鴻, “Optimizing Support Vector Machine Parameters with Genetic Algorithms: An Application to Financial Distress Prediction,” Ph.D. Dissertation, Department of Business Administration, National Taipei University, 2004.
[7]陳芃暐, “Model Selection for Support Vector Machines Using Genetic Algorithms,” Master's Thesis, Department of Electronic Engineering, National Taiwan University of Science and Technology, 2003.
[8]王界人, “Applying Genetic Algorithms to Parameter Tuning and Feature Selection for Support Vector Machines,” Master's Thesis, Department of Information Management, Huafan University, 2005.
[9]李育昇, “Applications of Support Vector Machines to Reliability Systems,” Master's Thesis, Department of Industrial Engineering, Da-Yeh University, 2004.
[10]段建帆, “A Distributed Data Mining System for Parameter Optimization and Feature Selection of Support Vector Machines Based on Particle Swarm Optimization,” Master's Thesis, Department of Information Management, Huafan University, 2005.
[11]林東毅, “Combining Rough Set Theory, Support Vector Machines, and Optimization Algorithms for Customer Relationship Management,” Master's Thesis, Department of Industrial Engineering and Technology Management, Da-Yeh University, 2006.
[12]楊舜麟, “An Analysis of Optimization Algorithms for Parameter Selection in Support Vector Regression,” Master's Thesis, Department of Industrial Engineering and Technology Management, Da-Yeh University, 2006.
[13]黃仁澤, “Feature Selection for High-Dimensional Data with an Application to Classifying Protein Mass Spectrometry Data,” Master's Thesis, Graduate Institute of Statistics, National Chengchi University, 2005.
[14]劉家輝, “Predicting DNA-Binding Proteins with Support Vector Machine Theory,” Master's Thesis, Department of Computer Science and Information Engineering, Chung Hua University, 2004.
[15]謝忠訓, “Web Page Classification Using Support Vector Machines with a Weighted Voting Scheme,” Master's Thesis, Department of Information Management, Chaoyang University of Technology, 2005.
[16]鄭清俊, “Applying Artificial Neural Networks and Support Vector Machines to Target Customer Selection,” Master's Thesis, Institute of Information Management, National Cheng Kung University, 2005.
[17]胡翠峰, “A Study of Fuzzy Correlation and Support Vector Learning for Multi-Label Document Classification,” Master's Thesis, Department of Information Management, I-Shou University, 2004.
[18]Vapnik, V. N., The Nature of Statistical Learning Theory. New York: Springer, 1995.
[19]Cao, L. J. and Tay, F. E. H., “Support Vector Machine with Adaptive Parameters in Financial Time Series Forecasting,” IEEE Transactions on Neural Networks, vol. 14, no. 6, 2003, pp. 1506-1518.
[20]Cai, Y. -D., Liu, X. -J., Xu, X. -B. and Chou, K. -C., “Prediction of Protein Structural Classes by Support Vector Machines,” Computers and Chemistry, vol. 26, 2002, pp. 293–296.
[21]Valentini, G., “Gene Expression Data Analysis of Human Lymphoma Using Support Vector Machines and Output Coding Ensembles,” Artificial Intelligence in Medicine, vol. 26, 2002, pp. 281–304.
[22]Ng, J. and Gong, S., “Composite Support Vector Machines for Detection of Faces Across Views and Pose Estimation,” Image and Vision Computing, vol. 20, 2002, pp. 359-368.
[23]Guo, Q., Kelly, M. and Graham, C. H., “Support Vector Machines for Predicting Distribution of Sudden Oak Death in California,” Ecological Modelling, vol. 182, 2005, pp. 75–90.
[24]Shin, K. -S., Lee, T. -S. and Kim, H. -J., “An Application of Support Vector Machines in Bankruptcy Prediction Model,” Expert Systems with Applications, vol. 28, 2005, pp. 127–135.
[25]Liang, J. -Z., “SVM Multi-Classifier and Web Document Classification,” Proceedings of the Third International Conference on Machine Learning and Cybernetics, vol. 3, 2004, pp. 1347-1351.
[26]Keerthi, S. S. and Lin, C. -J., “Asymptotic Behaviors of Support Vector Machines with Gaussian Kernel,” Neural Computation, vol. 15, 2003, pp. 1667-1689.
[27]Jack, L. B. and Nandi, A. K., “Fault Detection Using Support Vector Machines and Artificial Neural Networks, Augmented by Genetic Algorithms,” Mechanical Systems and Signal Processing, vol. 16, 2002, pp. 373-390.
[28]Shon, T., Kim, Y., Lee, C. and Moon, J., “A Machine Learning Framework for Network Anomaly Detection Using SVM and GA,” Proceedings of IEEE Workshop on Information Assurance and Security, vol. 2, 2005, pp. 176–183.
[29]Chen, R. -C. and Hsieh, C. -H., “Web Page Classification Based on a Support Vector Machine Using a Weighted Vote Schema,” Expert Systems with Applications. (In Press)
[30]Gold, C., Holub, A. and Sollich, P., “Bayesian Approach to Feature Selection and Parameter Tuning for Support Vector Machine Classifiers,” Neural Networks, vol. 18, 2005, pp. 693–701.
[31]Pai, P. -F. and Hong, W. -C., “Support Vector Machines with Simulated Annealing Algorithms in Electricity Load Forecasting,” Energy Conversion and Management, vol. 46, 2005, pp. 2669–2688.
[32]Pai, P. -F. and Hong, W. -C., “Software Reliability Forecasting by Support Vector Machines with Simulated Annealing Algorithms,” The Journal of Systems and Software, 2005. (In Press)
[33]Samanta, B., Al-Balushi, K. R. and Al-Araimi, S. A., “Artificial Neural Networks and Support Vector Machines with Genetic Algorithm for Bearing Fault Detection,” Engineering Applications of Artificial Intelligence, vol. 16, 2003, pp. 657–665.
[34]Romeijn, H. E., Zabinsky, Z. B., Graesser, D. L. and Neogi, S., “New Reflection Generator for Simulated Annealing in Mixed-Integer/Continuous Global Optimization,” Journal of Optimization Theory and Applications, vol. 101, 1999, pp. 403–427.
[35]Romeijn, H. E. and Smith, R. L., “Simulated Annealing for Constrained Global Optimization,” Journal of Global Optimization, vol. 5, 1994, pp. 101–126.
[36]Hettich, S., Blake, C. and Merz, C., UCI Repository of Machine Learning Databases, Department of Information and Computer Science, University of California, Irvine, 1998. Available: http://www.ics.uci.edu/~mlearn/MLRepository.html.
[37]Salzberg, S. L., “On Comparing Classifiers: Pitfalls to Avoid and a Recommended Approach,” Data Mining and Knowledge Discovery, vol. 1, 1997, pp. 317–327.
[38]Burges, C. J. C., “A Tutorial on Support Vector Machines for Pattern Recognition,” Data Mining and Knowledge Discovery, vol. 2, 1998, pp. 121–167.
[39]Zhang, Y. L., Guo, N., Du, H. and Li, W. H., “Automated Defect Recognition of C-SAM Image in IC Packaging Using Support Vector Machines,” International Journal of Advanced Manufacturing Technology, vol. 25, 2005, pp. 1191-1196.
[40]Lin, H. -T. and Lin, C. -J., “A Study on Sigmoid Kernels for SVM and the Training of Non-PSD Kernels by SMO-type Methods,” Technical Report, Department of Computer Science and Information Engineering, National Taiwan University, Taiwan, March 2003.
[41]Müller, K. R., Mika, S., Rätsch, G., Tsuda, K. and Schölkopf, B., “An Introduction to Kernel-Based Learning Algorithms,” IEEE Transactions on Neural Networks, vol. 12, 2001, pp. 181-202.
[42]Pardo, M. and Sberveglieri, G., “Classification of Electronic Nose Data with Support Vector Machines,” Sensors and Actuators B: Chemical, vol. 107, 2005, pp. 730–737.
[43]Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H. and Teller, E., “Equation of State Calculations by Fast Computing Machines,” The Journal of Chemical Physics, vol. 21, 1953, pp. 1087–1092.
[44]Kirkpatrick, S., Gelatt, C. D., Jr., and Vecchi, M. P., “Optimization by Simulated Annealing,” Science, vol. 220, 1983, pp. 671-680.
[45]Liu, H., and Motoda, H., Feature Selection for Knowledge Discovery and Data Mining, Norwell, MA: Kluwer Academic, 1998.
[46]Kohavi, R. and John, G. H., “Wrappers for Feature Subset Selection,” Artificial Intelligence, vol. 97, 1997, pp. 273-324.
[47]Uncu, Ö. and Türkşen, I. B., “A Novel Feature Selection Approach: Combining Feature Wrappers and Filters,” Information Sciences, 2006. (In Press)
[48]Mladenic, D. and Grobelnik, M., “Feature Selection on Hierarchy of Web Documents,” Decision Support Systems, vol. 35, 2003, pp. 45-87.
[49]Acır, N., Özdamar, Ö. and Guzelis, C., “Automatic Classification of Auditory Brainstem Responses Using SVM-based Feature Selection Algorithm for Threshold Detection,” Engineering Applications of Artificial Intelligence, vol. 19, 2006, pp. 209–218.
[50]Valentini, G., Muselli, M. and Ruffino, F., “Cancer Recognition with Bagged Ensembles of Support Vector Machines,” Neurocomputing, vol. 56, 2004, pp. 461–466.
[51]Zhang, L., Jack, L. B. and Nandi, A. K., “Fault Detection Using Genetic Programming,” Mechanical Systems and Signal Processing, vol. 19, 2005, pp. 271-289.
[52]Hsu, C. -W., Chang, C. -C. and Lin, C. -J., “A Practical Guide to Support Vector Classification,” Technical Report, Department of Computer Science and Information Engineering, National Taiwan University, Taiwan, July 2003.
[53]Wang, J., Wu, X. and Zhang, C., “Support Vector Machines Based on K-means Clustering for Real-Time Business Intelligence Systems,” International Journal of Business Intelligence and Data Mining, vol. 1, 2005, pp. 54-64.
[54]Wei, Y. and Lin, C. -J., Feature Extraction: Foundations and Applications, Springer, 2005.
[55]Pai, P. -F. and Hong, W. -C., “Forecasting Regional Electricity Load Based on Recurrent Support Vector Machines with Genetic Algorithms,” Electric Power Systems Research, vol. 74, 2005, pp. 417–425.
[56]Fung, G. and Mangasarian, O. L., “Finite Newton Method for Lagrangian Support Vector Machine Classification,” Neurocomputing, vol. 55, 2003, pp. 39-55.
[57]Liao, Y., Fang, S. -C. and, Nuttle, H. L. W., “A Neural Network Model with Bounded-Weights for Pattern Classification,” Computers and Operations Research, vol. 31, 2004, pp. 1411–1426.
[58]Bennett, K. P. and Blue, J. A., “A Support Vector Machine Approach to Decision Trees,” Proceedings of the IEEE World Congress on Computational Intelligence, 1997, pp. 2396-2401.