
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 黃郁翔 (Yu-Siang Houng)
Title: Improving the On-line Learning Efficiency of Support Vector Machines by Combining Linear Independence and Error Estimation
Title (English): Using Linear Independence and Error Estimation for Online Support Vector Machine
Advisor: 姚志佳 (Chih-Chia Yao)
Degree: Master's
Institution: Chaoyang University of Technology
Department: Master's Program, Department of Information Engineering
Discipline: Engineering
Field: Electrical Engineering and Computer Science
Thesis type: Academic thesis
Year of publication: 2013
Academic year of graduation: 101 (2012-2013)
Language: Chinese
Number of pages: 71
Keywords (Chinese): 權重最小平方支撐向量機器、支撐向量機器、線上即時學習、最小平方支撐向量機器
Keywords (English): Weighted Least Squares Support Vector Machines, Least Squares Support Vector Machines, Support Vector Machines, On-line Learning
Metrics:
  • Cited by: 0
  • Views: 213
  • Downloads: 15
  • Bookmarked: 0
This thesis proposes an on-line learning algorithm for support vector machines. Support vectors are screened with a linear-independence criterion; LS-SVM (Least Squares Support Vector Machines) then quickly computes the weight values, from which the total error is calculated. When the total error exceeds a threshold, a learning mechanism is triggered to suppress it. The learning mechanism is a modified WLS-SVM (Weighted Least Squares Support Vector Machines) that selects the data with the lowest error, adds them to the support-vector set, and returns to the LS-SVM computation until the error falls below the threshold. Experiments conducted in MATLAB on the UCI and IDA benchmark datasets show that the proposed LIEEOSVM (Using Linear Independence and Error Estimation for Online Support Vector Machine), compared with the IncrSVM, LIBSVM, POLSVM, OLSVM, and Online methods, greatly reduces the number of required support vectors while maintaining comparable accuracy and reducing computation time.
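The LS-SVM weight computation at the heart of the fast error estimate reduces to solving a single linear system, which is why re-estimating the model after each update is cheap. The following is a minimal sketch in Python/NumPy (a hypothetical toy example, not the thesis's MATLAB code; the RBF kernel, `gamma`, `sigma`, and the XOR-style data are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM training: solve the single linear system
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ alpha + b)

# Toy XOR-like data with labels in {-1, +1}
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
y = np.array([-1., -1., 1., 1.])
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, b, alpha, X))  # training points classified as [-1, -1, 1, 1]
```

Because training is one dense solve rather than a quadratic program, the residuals `y - (K @ alpha + b)` are available essentially for free, which is the property the error-estimation stage exploits.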
This thesis presents a novel on-line learning algorithm for support vector machines, built around a three-stage learning mechanism. In the first stage, linear independence is used to select support vectors. In the second stage, the parameters of the support vector machine are obtained quickly and the classification error is estimated using Least Squares Support Vector Machines. The third stage, which is triggered only when the classification error exceeds a threshold, recalculates the parameters to improve performance: part of the training patterns are selected as support vectors and the model is re-established using Weighted Least Squares Support Vector Machines. Experimental results show that the proposed algorithm significantly reduces the required number of support vectors while maintaining accuracy.
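The first-stage selection rule can be illustrated with an approximate-linear-dependence style test: a new sample joins the support-vector set only if its feature-space image is not (nearly) a linear combination of the current set. This is a sketch under assumed parameters — the threshold `nu`, the kernel, and the exact criterion are illustrative, not taken from the thesis:

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

def is_linearly_independent(sv, x_new, nu=1e-2, sigma=1.0):
    """Return True if x_new should be added to the support-vector set sv."""
    if not sv:
        return True
    K = np.array([[rbf(s, t, sigma) for t in sv] for s in sv])
    k = np.array([rbf(s, x_new, sigma) for s in sv])
    # delta = k(x, x) - k^T K^{-1} k is the residual of projecting
    # phi(x_new) onto span{phi(s) : s in sv}; a small delta means the
    # new sample is (almost) linearly dependent on the current set.
    delta = rbf(x_new, x_new, sigma) - k @ np.linalg.solve(K, k)
    return delta > nu

sv = []
stream = [np.array([0.0, 0.0]),
          np.array([0.0, 0.0]),   # duplicate: dependent, rejected
          np.array([3.0, 3.0])]   # far from the set: accepted
for x in stream:
    if is_linearly_independent(sv, x):
        sv.append(x)
print(len(sv))  # the duplicate is filtered out, leaving 2 support vectors
```

Filtering dependent samples this way keeps the kernel matrix small and well-conditioned, which in turn keeps the LS-SVM solve in the second stage fast.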
Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Tables
List of Figures
Chapter 1: Introduction
1.1 Preface
1.2 Motivation and Objectives
1.3 Overview of the Proposed Method
1.4 Thesis Organization
Chapter 2: Background
2.1 Support Vector Machines
2.1.1 Linearly Separable Case
2.1.2 Linearly Inseparable Case
2.1.3 Nonlinearly Separable Case
2.2 Least Squares Support Vector Machines
2.3 Weighted Least Squares Support Vector Machines
Chapter 3: Combining Linear Independence and Error Estimation to Improve Support Vector Machines
3.1 Selecting Support Vectors via Linear Independence
3.2 Fast Error Estimation with LS-SVM
3.3 WLS-SVM Learning Mechanism
3.3.1 Support Vector Adjustment Algorithm
Chapter 4: Experiments and Results
4.1 Experimental Setup
4.2 Experimental Results
Chapter 5: Conclusions
References



List of Tables
Table 4-1: Details of the benchmark datasets provided by Prof. Chih-Jen Lin
Table 4-2: Details of the IDA benchmark datasets
Table 4-3: Accuracy and number of support vectors on eight of the IDA datasets
Table 4-4: Comparison of training/testing accuracy and CPU computation time



List of Figures
Figure 2-1: Illustration of the SVM hyperplane
Figure 2-2: SVM classification results
Figure 2-3: SVM hyperplane with slack variables added to the objective function
Figure 2-4: Illustration of the feature-space mapping
Figure 3-1: Flowchart of the three-stage on-line learning algorithm
Figure 4-1: Distribution of the checkerboard data
Figure 4-2: Distribution of the checkerboard training data (75%)
Figure 4-3: Distribution of the checkerboard test data (25%)
Figure 4-4: Support vectors selected from samples 1-100
Figure 4-5: Support vectors selected from samples 101-200
Figure 4-6: Support vectors selected from samples 201-300
Figure 4-7: Support vectors selected from samples 301-400
Figure 4-8: Support vectors selected from samples 401-500
Figure 4-9: Support vectors selected from samples 501-600
Figure 4-10: Support vectors selected from samples 601-700
Figure 4-11: Support vectors selected from samples 701-750
Figure 4-12: Support-vector growth curve during checkerboard training
Figure 4-13: Accuracy curve during checkerboard training
Figure 4-14: Support-vector growth curve on the Diabetes dataset
Figure 4-15: Support-vector growth curve on the Adult7 dataset
Figure 4-16: Accuracy curve during Banana training
Figure 4-17: Support-vector growth curve during Banana training
Figure 4-18: Accuracy curve during Breast training
Figure 4-19: Support-vector growth curve during Breast training
Figure 4-20: Accuracy curve during Diabetis training
Figure 4-21: Support-vector growth curve during Diabetis training
Figure 4-22: Accuracy curve during German training
Figure 4-23: Support-vector growth curve during German training
Figure 4-24: Accuracy curve during Heart training
Figure 4-25: Support-vector growth curve during Heart training
Figure 4-26: Accuracy curve during Ringnorm training
Figure 4-27: Support-vector growth curve during Ringnorm training
Figure 4-28: Accuracy curve during Twonorm training
Figure 4-29: Support-vector growth curve during Twonorm training
Figure 4-30: Accuracy curve during Waveform training
Figure 4-31: Support-vector growth curve during Waveform training
References
[1]Y.-L. Lin, A Study on Classifiers Combining Neural Networks with Support Vector Machines, Master's thesis, Institute of Opto-Mechatronics Engineering, National Central University, 2008.
[2]J. Ozols and A. Borisov, “Fuzzy classification based on pattern projections analysis,” Pattern Recognition, vol.34, no 4, pp. 763-781, Apr. 2001.
[3]C.-L. Huang and C.-J. Wang, “A GA-based feature selection and parameters optimization for support vector machines,” Expert Systems with Applications, vol. 31, no. 2, pp. 231-240, Aug. 2006.
[4]V. N. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[5]Y.-J. Lee and O. L. Mangasarian, “SSVM: A Smooth Support Vector Machine for Classification,” Computational Optimization and Applications, vol.20, no. 1, pp. 5-22, 2001.
[6]G. Li, C. Wen, G.-B. Huang and Y. Chen, “Error tolerance based support vector machine for regression,” Neurocomputing, vol. 74, no. 5, pp. 771-782, Feb. 2011.
[7]M. Karasuyama and I. Takeuchi, “Multiple Incremental Decremental Learning of Support Vector Machines,” IEEE Trans. on Neural Networks, vol. 21, no. 7, pp. 1048-1059, Jul. 2010.
[8]G. Cauwenberghs, T. Poggio, “Incremental and decremental support vector machine learning,” Advances in Neural Information Processing Systems, pp. 409-415, 2000.
[9]J. Kivinen, A. Smola and R. Williamson, “Online learning with kernels,” IEEE Trans. on Signal Processing, vol. 52, no. 8, pp. 2165-2176, 2004.
[10]T. Gal, Postoptimal Analysis, Parametric Programming, and Related Topics, Berlin, New York: Walter de Gruyter, 1995.
[11]T. Hastie, S. Rosset, R. Tibshirani and J. Zhu, “The entire regularization path for the support vector machine,” J. Mach. Learning Res., vol. 5, pp. 1391-1415, 2004.
[12]F. Orabona, C. Castellini, B. Caputo, L. Jie and G. Sandini, “On-line independent support vector machines,” Pattern Recognition, vol. 42, pp. 1402-1412, Apr. 2010.
[13]J.A.K. Suykens and J. Vandewalle, “Least squares support vector machine classifiers,” Neural Processing Letters, vol. 9, no. 3, pp. 293-300, 1999.
[14]J. A. K. Suykens, J. D. Brabanter, L. Lukas and J. Vandewalle, “Weighted least squares support vector machines: robustness and sparse approximation,” Neurocomputing, vol. 48, no. 1-4, pp. 85-105, Oct. 2002.
[15]H. W. Kuhn and A. W. Tucker, “Nonlinear programming,” in Proceedings of the 2nd Berkeley Symposium, Berkeley: University of California Press, pp. 481-492, 1951.
[16]W. Karush, “Minima of Functions of Several Variables with Inequalities as Side Constraints,” M.Sc. Dissertation, Dept. of Mathematics, Univ. of Chicago, Chicago, Illinois, 1939.
[17]C. J. C. Burges, “A tutorial on support vector machines for pattern recognition,” Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.
[18]V. Vapnik, The nature of statistical learning theory, Springer-Verlag, New-York, 1995.
[19]N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines: Cambridge Univ. Press, 2000.
[20]A. Navia-Vazquez, R. Diaz-Morales, “Fast error estimation for efficient support vector machine growing,” Neurocomputing, vol. 73, no. 4-6, pp. 1018-1023, Jan. 2010.
[21]A. Navia-Vazquez, F. Perez-Cruz, A. Artes-Rodriguez, AR. Figueiras-Vidal, “Weighted least squares training of support vector classifiers leading to compact and adaptive schemes,” IEEE Trans Neural Netw, vol. 12, no. 5, pp.1047-1059, Sep. 2001.
[22]A. Navia-Vazquez, “Compact multi-class support vector machine,” Neurocomputing, vol. 71, no. 1-3, pp.400-405, Dec. 2007.
[23]A. Navia-Vazquez, E. Parrado-Hernandez, I. Mora-Jimenez, J. Arenas-Garca, A. R. Figueiras-Vidal, “Growing support vector classifiers with controlled complexity,” Pattern Recognition, vol. 36, no. 7, pp. 1479-1488, July 2003.
[24]Y. Engel, S. Mannor, R. Meir, “The kernel recursive least-squares algorithm,” IEEE Transactions on Signal Processing, vol. 52, no. 8, pp. 2275-2285, Aug. 2004.
[25]G. Cauwenberghs and T. Poggio, “Incremental and decremental support vector machine learning,” Advances in Neural Information Processing Systems, pp. 409–415, 2000.
[26]C.-C. Chang and C.-J. Lin, LIBSVM: a library for support vector machines, software available at: 〈http://www.csie.ntu.edu.tw/~cjlin/libsvm/〉, 2001.
[27]G. Ratsch, Benchmark repository, Technical Report, Intelligent Data Analysis Group, Fraunhofer-FIRST, available at: 〈http://ida.first.fraunhofer.de/raetsch〉, 2005.
[28]H. Duan, X. Shao, W. Hou, G. He and Q. Zeng, “An incremental learning algorithm for Lagrangian support vector machines,” Pattern Recognition Letters, vol. 30, no. 15, pp. 1384-1391, Nov. 2009.
[29]G. Cauwenberghs, T. Poggio, “Incremental and decremental support vector machine learning”, Adv. Neural Information Processing, vol. 13, 2001.