臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

(3.236.124.56) 您好!臺灣時間:2021/07/30 05:44
字體大小: 字級放大   字級縮小   預設字形  
回查詢結果 :::

Detail View
Author: 陳俊維 (Chun-Wei Chen)
Title: 基於串聯式特徵選取及支援向量機之快速分類法
Title (English): SVM Based Fast Classification Using Cascade Feature Selection
Advisor: 謝明得 (Ming-Der Shieh)
Degree: Master's
Institution: 國立成功大學 (National Cheng Kung University)
Department: 電機工程學系碩博士班 (Department of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Publication year: 2012
Graduation academic year: 100 (AY 2011-2012)
Language: English
Pages: 48
Keywords (Chinese): 特徵選取、支援向量機
Keywords (English): feature selection; SVM; support vector machine
Usage statistics:
  • Cited by: 0
  • Hits: 193
  • Downloads: 21
  • Bookmarked: 0
Abstract (Chinese, translated):
The support vector machine is a state-of-the-art maximum-margin classifier that has been applied in many fields. A major issue in implementing the classifier in hardware is that the memory for storing support vectors cannot be constrained to a fixed size: the required memory depends on the total number of support vectors, which can grow as large as the number of samples in the training dataset. Training datasets vary greatly in size across application domains, and even within a single domain such as image processing, data granularity can range from individual pixels up to whole video sequences. Granularity is proportional to the difficulty of collecting training data, and that difficulty in turn affects the training dataset that is ultimately used.
Although many studies have focused on reducing the total number of support vectors, most of those methods cannot maintain classification accuracy. In this thesis, we propose an approach based on cascade feature selection that reduces the number of support vectors: several linear classifiers are applied to subsets of the training dataset to lower the overall complexity, which in turn shrinks the support vector count. Experimental results confirm that the proposed algorithm effectively reduces the total number of support vectors while achieving classification accuracy close to that of a conventional radial basis function (RBF) SVM.

Abstract (English):
The support vector machine (SVM) is a state-of-the-art large-margin classifier that has been applied in many fields. The main issue in developing SVM hardware classifiers is the unbounded memory required to store support vectors: the memory size depends on the number of support vectors, which is upper-bounded by the number of training samples. The size of the training dataset varies across applications, and even within a single application such as image processing, data granularity differs greatly between the pixel level and the video-sequence level. Data granularity is proportional to the difficulty of data collection, which in turn is related to the training dataset size.
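The memory issue above is easy to observe with an off-the-shelf SVM library. The following minimal sketch (assuming scikit-learn and synthetic data, neither of which comes from the thesis) trains an RBF SVM, counts its support vectors, and estimates the storage a hardware classifier would need for them; nothing bounds that count except the training-set size.

    # Minimal sketch, assuming scikit-learn and synthetic data (not from
    # the thesis): the support-vector count, and hence the memory needed
    # to store the vectors, is bounded only by the training-set size.
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=2000, n_features=32, random_state=0)

    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

    n_sv, n_feat = clf.support_vectors_.shape  # support vectors x features
    mem_bytes = n_sv * n_feat * X.itemsize     # 8 bytes per float64 entry

    print(f"support vectors: {n_sv} of {len(X)} training samples")
    print(f"support-vector storage: ~{mem_bytes / 1024:.1f} KiB")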
Many techniques have been proposed to reduce the number of support vectors; however, most of them degrade classification accuracy. In this work, we propose a novel support vector reduction method based on cascade feature selection. The overall complexity is reduced by applying several linear classifiers to segments of the training dataset. Simulation results demonstrate that the proposed algorithm not only reduces the number of support vectors but also achieves accuracy comparable to that of a traditional radial basis function (RBF) SVM classifier.
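To make the cascade idea concrete, the sketch below lets a cheap linear stage decide the confidently separable samples and routes only the ambiguous ones, those falling near the linear decision boundary, to a small RBF SVM. The two-stage structure, the threshold tau, and the data are illustrative assumptions rather than the thesis's exact algorithm, but they show why the second stage needs fewer support vectors: it is trained on the ambiguous segment instead of the whole dataset.

    # Hedged sketch of a two-stage cascade (illustrative; not the exact
    # algorithm proposed in the thesis): a linear stage decides the easy
    # samples, and an RBF SVM is trained only on the ambiguous remainder.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC, LinearSVC

    X, y = make_classification(n_samples=2000, n_features=32, random_state=0)

    linear = LinearSVC(C=1.0, dual=False).fit(X, y)  # stage 1: linear scores
    tau = 0.5                                        # assumed ambiguity threshold
    ambiguous = np.abs(linear.decision_function(X)) < tau

    # Stage 2 only ever sees the ambiguous segment, so its support-vector
    # count is bounded by that (much smaller) subset.
    rbf = SVC(kernel="rbf", gamma="scale").fit(X[ambiguous], y[ambiguous])

    def cascade_predict(Xq):
        """Decide easy samples linearly; defer ambiguous ones to stage 2."""
        score = linear.decision_function(Xq)
        pred = (score > 0).astype(int)
        amb = np.abs(score) < tau
        if amb.any():
            pred[amb] = rbf.predict(Xq[amb])
        return pred

    full = SVC(kernel="rbf", gamma="scale").fit(X, y)  # monolithic baseline
    print("stage-2 support vectors :", rbf.support_vectors_.shape[0])
    print("baseline support vectors:", full.support_vectors_.shape[0])
    print("cascade training accuracy:", (cascade_predict(X) == y).mean())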

Contents
Chapter 1 Introduction
1.1 Motivation
1.2 Thesis organization
Chapter 2 Background
2.1 SVM algorithm
2.2 Support vector reduction methods
2.2.1 Pre-training method
2.2.2 Inter-training method
2.2.3 Post-training method
2.2.4 Performance considerations
Chapter 3 Proposed high-accuracy support vector reduction algorithm
3.1 Cascade feature selection
3.2 Data ambiguity relaxation
3.2.1 Elements of support vectors
3.2.2 Sample ambiguity and feature space projection
3.3 Efficient dataset segmentation
3.4 Cost function aggregation
3.4.1 Ambiguity cost
3.4.2 Accuracy cost
3.4.3 Segmentation cost
Chapter 4 Cost evaluation and simulation results
4.1 Support vector memory cost
4.2 Complexity of different kernel functions
4.3 Simulation results
4.3.1 Artificial dataset
4.3.2 Real dataset
Chapter 5 Conclusions and future work
5.1 Conclusions
5.2 Future work
References
