National Digital Library of Theses and Dissertations in Taiwan



Detailed Record


• Cited by: 2
• Views: 236
• Rating:
• Downloads: 26
• Bookmarks: 0
The decomposition method is currently the main approach for solving support vector machine (SVM) problems. A key issue of this method is the selection of the working set. In the first part of this thesis, drawing on the design of decomposition methods for bound-constrained SVM and on the results of experimental analysis, we propose a simple selection method that speeds up convergence on difficult problems. Experimental results on different types of problems show that the proposed method is viable. The second part of the thesis concerns decomposition methods for multi-class SVM. The original SVM handles only binary classification, and how to extend it effectively to multi-class classification remains an active research topic. Many methods have been proposed: some combine several binary SVMs into one multi-class SVM, while others consider all classes at once. However, because solving multi-class problems is computationally demanding, comparisons on large problems have not been carefully studied; in particular, the second type of method requires solving a much larger optimization problem, and previous experiments were limited to small problems. In this thesis we present decomposition implementations of two all-at-once methods and compare them with three combination-based methods: one-against-all, one-against-one, and the directed acyclic graph SVM (DAGSVM). The results show that one-against-one and DAGSVM are more suitable for practical use, and that the all-at-once methods use fewer support vectors.
The decomposition method is currently one of the major methods for solving support vector machines (SVM). An important issue of this method is the selection of working sets. In the first part of this thesis, through the design of decomposition methods for bound-constrained SVM formulations and from experimental analysis, we propose a simple selection of the working set which leads to faster convergence for difficult cases. Numerical experiments on different types of problems are conducted to demonstrate the viability of the proposed method. The second part of this thesis focuses on decomposition methods for multi-class SVM. As SVM was originally designed for binary classification, how to effectively extend it to multi-class classification is still an ongoing research issue. Several methods have been proposed in which a multi-class classifier is typically constructed by combining several binary classifiers. Some authors have also proposed methods that consider all classes of data at once. As solving multi-class problems is computationally more expensive, comparisons of these methods on large-scale problems have not been seriously conducted. Especially for methods solving multi-class SVM in one step, a much larger optimization problem is required, so experiments to date have been limited to small data sets. In this thesis we give decomposition implementations for two such "all-together" methods: \cite{VV98a,JW98a} and \cite{KC00a}. We then compare their performance with three methods based on binary classification: "one-against-all," "one-against-one," and DAGSVM \cite{JP00a}. Our experiments indicate that the "one-against-one" and DAG methods are more suitable for practical use than the other methods. Results also show that for large problems, the methods that consider all data at once in general need fewer support vectors.
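As an illustration of the "one-against-one" strategy compared above, the following is a minimal sketch, not the thesis's implementation (which trains binary SVMs with a decomposition solver): k(k-1)/2 pairwise classifiers each cast a vote, and the class with the most votes wins. The `train_midpoint` helper is a hypothetical stand-in for a trained binary SVM, classifying 1-D points by the nearer class mean.

```python
from itertools import combinations

def train_midpoint(xs_i, xs_j):
    """Toy stand-in for a trained binary SVM on 1-D data:
    classify by which class mean is closer (0 = first class)."""
    mi = sum(xs_i) / len(xs_i)
    mj = sum(xs_j) / len(xs_j)
    return lambda x: 0 if abs(x - mi) <= abs(x - mj) else 1

def one_against_one(data):
    """data: dict mapping label -> list of 1-D points.
    Train one binary classifier for every pair of labels."""
    models = {}
    for a, b in combinations(sorted(data), 2):
        models[(a, b)] = train_midpoint(data[a], data[b])
    return models

def predict(models, x):
    """Max-wins voting over all pairwise classifiers."""
    votes = {}
    for (a, b), clf in models.items():
        winner = a if clf(x) == 0 else b
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)

data = {"A": [0.0, 1.0], "B": [5.0, 6.0], "C": [10.0, 11.0]}
models = one_against_one(data)
print(predict(models, 5.2))  # a point near class B's examples -> "B"
```

With k classes this trains k(k-1)/2 classifiers, but each sees only two classes' data, which is why the abstract finds this scheme practical on large problems.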
ABSTRACT ii
ACKNOWLEDGEMENTS iii
LIST OF FIGURES vi
LIST OF TABLES vii
CHAPTER
I. Introduction 1
II. The SVM Problem 5
2.1 Basic Concepts of SVM 5
2.2 Decomposition Algorithms 9
III. Selection of the Working Set 12
IV. Computational Experiments on Algorithm III.1 23
4.1 Numerical Experiments on BSVM 23
4.2 Using Algorithm III.1 for the Standard SVM Formulation 31
V. Five Methods for Multi-class SVM 33
5.1 One-against-all Method 33
5.2 One-against-one Method 34
5.3 DAGSVM Method 35
5.4 A Method Considering All Data at Once and a Decomposition Implementation 36
5.5 Method by Crammer and Singer 43
VI. Numerical Experiments on Multi-class SVM 47
6.1 Data and Implementation 47
6.2 Results and Discussions 49
VII. Conclusions and Discussions 59
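Section 5.3 of the outline covers the DAGSVM method. As a hedged sketch of its evaluation phase, assuming the usual formulation (Platt, Cristianini, and Shawe-Taylor) rather than code from the thesis: prediction keeps a list of candidate classes, and each pairwise decision eliminates one class, so a k-class prediction needs only k-1 binary evaluations. The `pairwise` dictionary of toy classifiers below is hypothetical.

```python
def dag_predict(classes, pairwise, x):
    """DAGSVM-style evaluation: repeatedly test the first vs. the
    last remaining candidate class and drop the loser, until one
    class remains. `pairwise[(a, b)](x)` returns the winning label.
    Assumes `classes` is sorted so the (first, last) keys exist."""
    candidates = list(classes)
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        winner = pairwise[(a, b)](x)
        if winner == a:
            candidates.pop()      # b is eliminated
        else:
            candidates.pop(0)     # a is eliminated
    return candidates[0]

# Toy pairwise classifiers: nearest class mean on 1-D data.
means = {"A": 0.5, "B": 5.5, "C": 10.5}

def make_clf(a, b):
    return lambda x: a if abs(x - means[a]) <= abs(x - means[b]) else b

pairwise = {(a, b): make_clf(a, b)
            for a in means for b in means if a < b}
print(dag_predict(["A", "B", "C"], pairwise, 9.0))  # -> "C"
```

Training is identical to one-against-one (one classifier per pair); only the prediction path differs, which is one reason the thesis finds both methods similarly practical.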
K. P. Bennett, D. Hui, and L. Auslender. On support vector decision trees for database marketing. Department of Mathematical Sciences Math Report No. 98-100, Rensselaer Polytechnic Institute, Troy, NY, Mar. 1998.
C. L. Blake and C. J. Merz. UCI repository of machine learning databases. Technical report, University of California, Department of Information and Computer Science, Irvine, CA, 1998. Available at http://www.ics.uci.edu/~mlearn/MLRepository.html.
B. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 1992.
L. Bottou, C. Cortes, J. Denker, H. Drucker, I. Guyon, L. Jackel, Y. LeCun, U. Muller, E. Sackinger, P. Simard, and V. Vapnik. Comparison of classifier methods: a case study in handwriting digit recognition. In International Conference on Pattern Recognition, pages 77--87. IEEE Computer Society Press, 1994.
E. J. Bredensteiner and K. P. Bennett. Multicategory classification by support vector machines. Computational Optimizations and Applications, pages 53--79, 1999.
M. P. S. Brown, W. N. Grundy, D. Lin, N. Cristianini, C. Sugnet, T. S. Furey, M. Ares Jr., and D. Haussler. Knowledge-based analysis of microarray gene expression data using support vector machines. PNAS, 97(1):262--267, 2000.
C. J. C. Burges. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2):121--167, 1998.
C.-C. Chang, C.-W. Hsu, and C.-J. Lin. The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 11(4):1003--1008, 2000.
C.-C. Chang and C.-J. Lin. LIBSVM: a library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
K. K. Chin. Support vector machines applied to speech pattern classification. Master's thesis, University of Cambridge, 1998.
C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20:273--297, 1995.
K. Crammer and Y. Singer. On the learnability and design of output codes for multiclass problems. In Computational Learning Theory, pages 35--46, 2000.
K. Crammer and Y. Singer. Ultraconservative online algorithms for multiclass problems. Technical report, School of Computer Science and Engineering, Hebrew University, 2001.
N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge University Press, Cambridge, UK, 2000.
D. DeCoste and B. Schölkopf. Training invariant support vector machines. Machine Learning, 2001. To appear.
R. Fletcher. Practical Methods of Optimization. John Wiley and Sons, 1987.
J. Friedman. Another approach to polychotomous classification. Technical report, Department of Statistics, Stanford University, 1996. Available at http://www-stat.stanford.edu/reports/friedman/poly.ps.
T.-T. Friess, N. Cristianini, and C. Campbell. The kernel adatron algorithm: a fast and simple learning procedure for support vector machines. In Proceedings of the 15th International Conference on Machine Learning. Morgan Kaufmann Publishers, 1998.
Y. Guermeur. Combining discriminant models with new multi-class SVMs. NeuroCOLT Technical Report NC-TR-00-086, LORIA Campus Scientifique, 2000.
T. K. Ho and E. M. Kleinberg. Building projectable classifiers of arbitrary complexity. In Proceedings of the 13th International Conference on Pattern Recognition, pages 880--885, Vienna, Austria, August 1996.
C.-W. Hsu and C.-J. Lin. A comparison on methods for multi-class support vector machines. Technical report, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan, 2001.
C.-W. Hsu and C.-J. Lin. A simple decomposition method for support vector machines. Machine Learning, 2001. To appear.
T. Joachims. Making large-scale SVM learning practical. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, Cambridge, MA, 1998. MIT Press.
T. Joachims. Transductive inference for text classification using support vector machines. In Proceedings of the International Conference on Machine Learning, 1999.
T. Joachims, 2000. Private communication.
J. Kindermann, E. Leopold, and G. Paass. Multi-class classification with error correcting codes. In E. Leopold and M. Kirsten, editors, Treffen der GI-Fachgruppe 1.1.3, Maschinelles Lernen, 2000. GMD Report 114.
S. Knerr, L. Personnaz, and G. Dreyfus. Single-layer learning revisited: a stepwise procedure for building and training a neural network. In J. Fogelman, editor, Neurocomputing: Algorithms, Architectures and Applications. Springer-Verlag, 1990.
U. Kreßel. Pairwise classification and support vector machines. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, pages 255--268, Cambridge, MA, 1999. MIT Press.
P. Laskov. An improved decomposition algorithm for regression support vector machines. In Workshop on Support Vector Machines, NIPS 99, 1999.
Y. LeCun, L. Jackel, L. Bottou, A. Brunot, C. Cortes, J. Denker, H. Drucker, I. Guyon, U. Muller, E. Sackinger, P. Simard, and V. Vapnik. Comparison of learning algorithms for handwritten digit recognition. In F. Fogelman and P. Gallinari, editors, International Conference on Artificial Neural Networks, pages 53--60, Paris, 1995. EC2 & Cie.
C.-J. Lin. On the convergence of the decomposition method for support vector machines. Technical report, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan, 2000. To appear in IEEE Transactions on Neural Networks.
C.-J. Lin. Formulations of support vector machines: a note from an optimization point of view. Neural Computation, 13(2):307--317, 2001.
C.-J. Lin. Stopping criteria of decomposition methods for support vector machines: a theoretical justification. Technical report, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan, 2001.
C.-J. Lin and J. J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100--1127, 1999. Software available at http://www.mcs.anl.gov/~more/tron.
O. L. Mangasarian and D. R. Musicant. Successive overrelaxation for support vector machines. IEEE Transactions on Neural Networks, 10(5):1032--1037, 1999.
N. Matic, I. Guyon, J. Denker, and V. Vapnik. Writer adaptation for on-line handwritten character recognition. In Second International Conference on Pattern Recognition and Document Analysis, pages 187--191, Tsukuba, Japan, 1993. IEEE Computer Society Press.
E. Mayoraz and E. Alpaydin. Support vector machines for multi-class classification. In IWANN (2), pages 833--842, 1999.
D. Michie, D. J. Spiegelhalter, and C. C. Taylor. Machine Learning, Neural and Statistical Classification. Prentice Hall, Englewood Cliffs, NJ, 1994. Data available at anonymous ftp: ftp.ncc.up.pt/pub/statlog/.
K.-R. Müller, A. Smola, G. Rätsch, B. Schölkopf, J. Kohlmorgen, and V. Vapnik. Predicting time series with support vector machines. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, pages 243--254, Cambridge, MA, 1999. MIT Press.
P. M. Murphy and D. W. Aha. UCI repository of machine learning databases. Technical report, University of California, Department of Information and Computer Science, Irvine, CA, 1994. Data available at http://www.ics.uci.edu/~mlearn/MLRepository.html.
E. Osuna, R. Freund, and F. Girosi. Support vector machines: training and applications. AI Memo 1602, Massachusetts Institute of Technology, 1997.
E. Osuna, R. Freund, and F. Girosi. Training support vector machines: an application to face detection. In Proceedings of CVPR'97, 1997.
C. Papageorgiou, M. Oren, and T. Poggio. A general framework for object detection. In International Conference on Computer Vision (ICCV'98), 1998.
J. C. Platt. Fast training of support vector machines using sequential minimal optimization. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, Cambridge, MA, 1998. MIT Press.
J. C. Platt, N. Cristianini, and J. Shawe-Taylor. Large margin DAGs for multiclass classification. In Advances in Neural Information Processing Systems, volume 12, pages 547--553. MIT Press, 2000.
M. J. D. Powell. On search directions for minimization. Mathematical Programming, 4:193--201, 1973.
M. Rychetsky, S. Ortmann, and M. Glesner. Construction of a support vector machine with local experts. In Workshop on Support Vector Machines at the International Joint Conference on Artificial Intelligence (IJCAI 99), 1999.
C. Saunders, M. O. Stitson, J. Weston, L. Bottou, B. Schölkopf, and A. Smola. Support vector machine reference manual. Technical Report CSD-TR-98-03, Royal Holloway, University of London, Egham, UK, 1998.
B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors. Advances in Kernel Methods - Support Vector Learning. MIT Press, Cambridge, MA, 1998.
M. O. Stitson, A. Gammerman, V. Vapnik, V. Vovk, C. Watkins, and J. Weston. Support vector regression with ANOVA decomposition kernels. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, pages 285--292. MIT Press, Cambridge, MA, 1998.
R. Vanderbei. LOQO: an interior point code for quadratic programming. Technical Report SOR 94-15, Statistics and Operations Research, Princeton University, 1994. Revised November 1998.
V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York, NY, 1995.
V. Vapnik. Statistical Learning Theory. Wiley, New York, NY, 1998.
J. Weston, 2001. Private communication.
J. Weston and C. Watkins. Multi-class support vector machines. Technical Report CSD-TR-98-04, Royal Holloway, 1998.
A. Zien, G. Rätsch, S. Mika, B. Schölkopf, C. Lemmen, A. Smola, T. Lengauer, and K.-R. Müller. Engineering support vector machine kernels that recognize translation initiation sites. In German Conference on Bioinformatics, 1999.

1. An intrusion detection system combining one-class and multi-class SVM
2. A study of support vector machines for network intrusion detection and training-sample selection
3. SVM-based hierarchical multi-class document classification
4. Low-degree polynomial data mapping and support vector machines

No related journal articles.

1. Model selection for bound-constrained support vector machines with the RBF kernel
2. A study of new support vector machines
3. Case studies in data mining
4. A discussion on parameter selection for support vector machines
5. Reduction techniques for support vector machines
6. Scalable video coding using MPEG-4 static texture compression techniques
7. Design and implementation of a bandwidth management system with service translation
8. Clustering transactional data sets based on statistical measures
9. A study of fast wavelet transforms
10. Digital watermarking of music signals
11. A study of hierarchical clustering algorithms based on gravity theory
12. Avoiding packet duplication in many-to-many packet multicast
13. Case studies of two support vector machines for data mining
14. Applications of support vector machines in bioinformatics
15. Panoramic-image-guided video tracking and its application to augmented reality
