National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)


Detailed Record

Author: 簡旭佑
Author (English): Hsu-Yu Chien
Title: 運用貓群演算法於特徵值選取與支援向量機參數最佳化之研究
Title (English): A Study of Feature Selection and Parameter Determination for Support Vector Machine Based on Cat Swarm Optimization
Advisor: 林冠成
Advisor (English): Kuan-Cheng Lin
Degree: Master's
Institution: National Chung Hsing University (國立中興大學)
Department: Department of Management Information Systems (資訊管理學系所)
Discipline: Computer Science
Field: General Computer Science
Thesis type: Academic thesis
Year of publication: 2010
Academic year of graduation: 98 (2009–2010)
Language: English
Pages: 47
Keywords (Chinese): 特徵值選取、貓群最佳化演算法、支援向量機、參數調整
Keywords (English): feature selection; cat swarm optimization; support vector machines; parameter determination
Metrics:
  • Cited by: 1
  • Views: 372
  • Downloads: 0
  • Bookmarked: 0
Abstract (in Chinese, translated):
Classification problems have long been widely studied across many fields, such as data mining, pattern recognition, bioinformatics, and document classification. In any of these fields, the goal of classification is to build an optimal model that effectively separates data into two or more classes and provides a reasonable interpretation of the phenomena involved. Two factors are frequently discussed with respect to the classification performance of support vector machines: feature selection and classifier parameter tuning. Feature selection aims to reduce the number of features by effectively removing irrelevant, noisy, and redundant data; in addition, tuning the classifier's parameters during training can also effectively improve classification performance.
This study therefore proposes a classification model that combines cat swarm optimization (CSO) with a support vector machine (SVM) to handle feature selection and classifier parameter tuning simultaneously and thereby solve data classification problems effectively. CSO searches for the optimal feature subset and the SVM kernel parameters used to build the classification model, with the goal of reducing the number of features and the classifier's computation time while maintaining, or even improving, classification accuracy.
The experiments use datasets from the UCI Machine Learning Repository to evaluate the classification accuracy of the CSO+SVM model against GA+SVM (genetic algorithm) and PSO+SVM (particle swarm optimization) models, and compare classification efficiency against GA+SVM within a fixed time budget. The results show that the proposed CSO+SVM model effectively improves both classification accuracy and classification efficiency.


Abstract (in English):
This research constructs a CSO+SVM classification model by integrating cat swarm optimization into an SVM classifier. Two factors of classification problems, feature selection and parameter determination, are the main focus of this study. Feature selection aims to reduce the number of features and to remove irrelevant, noisy, and redundant data; in addition, optimizing the training parameters can improve classification performance. The optimal feature subset and kernel parameters found by CSO are therefore applied to the SVM classifier to reduce computation time at an acceptable classification accuracy, and in some cases to increase accuracy. Datasets of different classes and types from the UCI Machine Learning Repository are used to evaluate the classification accuracy of the proposed CSO+SVM method against GA+SVM and PSO+SVM. Experimental results show that the proposed CSO+SVM method effectively improves both classification accuracy and classification efficiency.
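The abstract describes encoding each candidate solution as a feature subset plus SVM kernel parameters, scored by classification accuracy. As a rough illustration only (not the thesis's implementation), the sketch below runs a simplified CSO seeking mode over such an encoding: a binary feature mask together with the SVM parameters C and gamma. The `fitness` function here is a hypothetical stand-in for SVM cross-validation accuracy, and every constant (population size, mutation ranges, the "relevant" feature set) is illustrative:

```python
import random

random.seed(0)

N_FEATURES = 8

def fitness(mask, C, gamma):
    # Hypothetical stand-in for SVM cross-validation accuracy:
    # rewards hitting a few "relevant" features and penalizes extra
    # features and parameter values far from an assumed optimum.
    if not any(mask):
        return 0.0
    relevant = {0, 2, 5}                      # pretend only these matter
    hit = sum(1 for i in relevant if mask[i])
    miss = sum(mask) - hit
    return hit - 0.5 * miss - abs(C - 1.0) - abs(gamma - 0.1)

def random_cat():
    # One cat = one candidate solution: feature mask + (C, gamma).
    mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
    return (mask, random.uniform(0.1, 10.0), random.uniform(0.01, 1.0))

def seek(cat):
    # Seeking mode: make several mutated copies of the cat and
    # keep the fittest copy.
    copies = []
    for _ in range(5):
        mask, C, gamma = list(cat[0]), cat[1], cat[2]
        mask[random.randrange(N_FEATURES)] ^= 1      # flip one feature bit
        C = max(0.1, C + random.uniform(-0.5, 0.5))
        gamma = max(0.01, gamma + random.uniform(-0.05, 0.05))
        copies.append((mask, C, gamma))
    return max(copies, key=lambda c: fitness(*c))

cats = [random_cat() for _ in range(10)]
best = max(cats, key=lambda c: fitness(*c))
for _ in range(50):
    cats = [seek(c) for c in cats]                   # all cats seek here
    gen_best = max(cats, key=lambda c: fitness(*c))
    if fitness(*gen_best) > fitness(*best):
        best = gen_best

print("best mask:", best[0], "C=%.2f gamma=%.3f" % (best[1], best[2]))
```

In the actual model, `fitness` would train an SVM on the selected features (the thesis cites LIBSVM as its SVM tool) and return cross-validation accuracy; CSO's tracing mode, which moves a fraction of cats toward the current best solution via a velocity update, is omitted here for brevity.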

Table of Contents
Abstract (in Chinese) i
Abstract (in English) ii
Table of Contents iii
List of Tables v
List of Figures vi
Chapter 1. Introduction 1
1.1. Background and Motivation 1
1.2. Organization of the Thesis 4
Chapter 2. Literature Reviews 5
2.1. Support Vector Machines 5
2.1.1. Linearly Separable SVM 5
2.1.2. Linearly Non-separable SVM 8
2.1.3. Non-linearly Separable SVM 10
2.2. Feature Selection 12
2.3. Cat Swarm Optimization 13
2.3.1. Seeking Mode 14
2.3.2. Tracing Mode 20
2.3.3. Basic Process of the CSO 20
Chapter 3. The Proposed CSO+SVM Method 23
3.1. Solution Set Design 23
3.2. System Architecture 24
Chapter 4. Experimental Results 26
4.1. Accuracy 26
4.2. Time Consuming 31
4.3. CSO Parameters Settings 37
Chapter 5. Conclusions and Future Works 42
References 43


