臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: Lee, Jen-Hao (李仁豪)
Title: Model Selection of the Bounded SVM Formulation Using the RBF Kernel (有界支向機使用RBF核心之模型選取)
Advisor: Lin, Chih-Jen (林智仁)
Degree: Master's
Institution: National Taiwan University
Department: Graduate Institute of Computer Science and Information Engineering
Discipline: Engineering
Field: Electrical Engineering and Computer Science
Thesis type: Academic thesis
Year of publication: 2001
Graduation academic year: 89 (ROC calendar, i.e. 2000-2001)
Language: English
Pages: 66
Keywords (Chinese): 支向機, 模型選取
Keywords (English): Support Vector Machine, Model Selection, Leave-one-out
Statistics:
  • Cited by: 1
  • Views: 457
  • Downloads: 58
  • Bookmarked: 1
The support vector machine (SVM) has recently become one of the most
prominent methods in machine learning. A solid theoretical foundation,
together with increasingly mature implementation techniques, enables
SVMs to solve moderate to large problems, with performance competitive
with mainstream methods such as neural networks and decision trees.
In practice, several SVM parameters must be set, so model selection
remains an open problem: the goal is to tune the parameters to obtain
better generalization performance.
This thesis studies model selection for an SVM variant without linear
constraints (only bounded constraints), restricted to the RBF kernel,
using the leave-one-out (LOO) cross-validation result as the selection
criterion. We first propose a simple framework: compute the LOO rate at
every point of a given model space and select the best point.
We then apply implementation tricks to avoid unnecessary computation
and speed up the whole model selection process. Finally, based on our
observations, we propose heuristics for locating good model spaces.
Experiments show that the methods developed in this thesis give good
results in both efficiency and learning performance, and the
observations and heuristics proposed here may also benefit related
research.
The support vector machine (SVM) has become one of the most promising and
popular methods in machine learning. Sound theory and careful implementation
make SVM efficient enough to solve moderate to large problems, and
the performance has been shown to be competitive with existing methods such as
neural networks and decision trees.
One remaining problem in the practical use of SVMs is model selection.
That is, there are several parameters to tune
so that better generalization accuracy can be achieved.
This thesis studies model selection for
an SVM classification formulation with only bounded
constraints, using the RBF kernel,
with the leave-one-out (loo) rate as the selection criterion. A simple framework
is presented first, in which loo rates are computed exactly
over a given model space. Next, some tricks are utilized
to avoid unnecessary computation. Some heuristics are also proposed for
locating good areas more efficiently, based on observations of
loo rates and the distribution of training time over the model space.
The experiments show that the software developed here performs
well in terms of both computational time and loo rates, and the heuristics
proposed here should be helpful for other SVM model selection software.
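The exhaustive framework described above can be sketched in a few lines: walk a grid of candidate models, compute the exact leave-one-out error of each by retraining on every size n-1 subset, and keep the best point. The sketch below uses a toy RBF kernel density classifier in place of the thesis's BSVM solver, and the data set and gamma grid are illustrative assumptions, not values from the thesis.

```python
# Exhaustive grid model selection with exact leave-one-out (LOO) error.
# A toy RBF kernel density classifier stands in for the BSVM solver;
# the data and the gamma grid below are illustrative assumptions.
import math

# toy 2-class data: (feature vector, label)
data = [((0.0, 0.2), 0), ((0.3, 0.1), 0), ((0.2, 0.4), 0), ((0.1, 0.0), 0),
        ((1.0, 0.9), 1), ((0.8, 1.1), 1), ((1.2, 1.0), 1), ((0.9, 0.8), 1)]

def rbf(x, z, gamma):
    # RBF kernel: exp(-gamma * ||x - z||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def predict(train, x, gamma):
    # classify x by the class with the largest summed RBF similarity
    score = {}
    for z, y in train:
        score[y] = score.get(y, 0.0) + rbf(x, z, gamma)
    return max(score, key=score.get)

def loo_error(gamma):
    # exact LOO: leave each point out once, train on the rest, test on it
    errors = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        if predict(train, x, gamma) != y:
            errors += 1
    return errors / len(data)

# the model space: a log-scale grid of gamma values, as in grid search
grid = [2.0 ** k for k in range(-4, 5)]
best_gamma = min(grid, key=loo_error)
print("best gamma:", best_gamma, "LOO error:", loo_error(best_gamma))
```

For an SVM the grid would be two-dimensional, over both the penalty parameter C and the kernel parameter gamma, which is exactly why the exhaustive computation becomes expensive and the tricks of Chapter 3 matter.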
Abstract ii
Acknowledgments iii
List of Tables vi
List of Figures vii
Chapter 1 Introduction 1
1.1 Background 1
1.2 Model Selection 1
1.3 Estimators of Generalization Performance 2
1.4 Our Work 3
1.5 Overview 3
Chapter 2 Basic Concepts of SVM 5
2.1 SVM 5
2.2 BSVM 9
2.3 The Radial Basis Functions Kernel 10
2.4 Leave-one-out Cross-validation of BSVM 11
2.5 Model Selection on SVM 13
Chapter 3 Implementation Techniques 15
3.1 Related Details of BSVM Implementation 15
3.2 Baseline Implementation 17
3.2.1 Leave-one-out Cross-validation 17
3.2.2 Model Selection 19
3.3 Improvement Tricks on the Complete Grid 20
3.3.1 Using the Relation Between (D) and (Dj) 20
3.3.2 Using the Relation Between Different Models 21
3.3.3 Relaxation of the Stopping Tolerance 23
3.3.4 Experiments 23
3.4 Reducing the Search Space 30
3.4.1 The Early Termination Method 30
3.4.2 The Free-skip Method 35
Chapter 4 Observations 38
4.1 Training with Different Parameters 38
4.2 Locating the Good Region 41
4.3 Using a Smaller Training Set 48
Chapter 5 Conclusions and Discussions 55
Appendices 57
Bibliography 64
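Section 3.4.1 of the contents names an early termination method for reducing the search cost. The thesis's exact procedure is not reproduced here, but a common way to cut exact-LOO cost in a grid search is to abandon a model as soon as its running error count can no longer beat the best model found so far; the sketch below shows that generic idea under that assumption, with a toy threshold-classifier family standing in for the SVM.

```python
# Generic early termination during exact LOO grid search: stop counting
# LOO errors for a model once the count exceeds the best seen so far,
# since the model can no longer win.  This is a common trick sketched
# as an assumption, not necessarily the thesis's Section 3.4.1 method.

def loo_errors_with_cutoff(data, predict, cutoff):
    """Count LOO errors, stopping once `cutoff` is exceeded."""
    errors = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        if predict(train, x) != y:
            errors += 1
            if errors > cutoff:
                return errors  # enough to rule this model out
    return errors

def select_model(data, models, make_predictor):
    best_model, best_err = None, len(data) + 1
    for m in models:
        err = loo_errors_with_cutoff(data, make_predictor(m), best_err)
        if err < best_err:
            best_model, best_err = m, err
    return best_model, best_err

# toy usage: 1-D points, models are decision thresholds (hypothetical data)
data = [(0.1, 0), (0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
def make_predictor(thresh):
    return lambda train, x: int(x >= thresh)
models = [0.0, 0.3, 0.5, 0.7, 1.0]
best, err = select_model(data, models, make_predictor)
print("best model:", best, "LOO errors:", err)
```

The saving grows as the incumbent improves: once a zero-error model is found, every later candidate is abandoned at its first mistake.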
B. Efron. The Jackknife, the Bootstrap and Other Resampling Plans. Bristol: Arrowsmith, 1982.
B. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 1992.
C. Campbell. An introduction to kernel methods. In R. J. Howlett and L. C. Jain, editors, Radial basis function networks: design and applications, Berlin, 2000. Springer Verlag.
O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee. Choosing kernel parameters for support vector machines. Technical report, 2000. Submitted to Machine Learning.
C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20:273--297, 1995.
D. DeCoste and K. Wagstaff. Alpha seeding for support vector machines. In Proceedings of the International Conference on Knowledge Discovery and Data Mining (KDD-2000), 2000.
T.-T. Friess, N. Cristianini, and C. Campbell. The kernel adatron algorithm: a fast and simple learning procedure for support vector machines. In Proceedings of the 15th International Conference on Machine Learning. Morgan Kaufmann Publishers, 1998.
C.-W. Hsu and C.-J. Lin. A simple decomposition method for support vector machines. Machine Learning, 2001. To appear.
T. S. Jaakkola and D. Haussler. Probabilistic kernel regression models. In Proceedings of the 1999 Conference on AI and Statistics, 1999.
T. Joachims. Making large-scale SVM learning practical. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, Cambridge, MA, 1998. MIT Press.
T. Joachims. Estimating the generalization performance of an SVM efficiently. In Proceedings of the International Conference on Machine Learning, San Francisco, 2000. Morgan Kaufmann.
J. C. Platt. Fast training of support vector machines using sequential minimal optimization. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, Cambridge, MA, 1998. MIT Press.
C. Saunders, M. O. Stitson, J. Weston, L. Bottou, B. Schölkopf, and A. Smola. Support vector machine reference manual. Technical Report CSD-TR-98-03, Royal Holloway, University of London, Egham, UK, 1998.
B. Schölkopf, J. C. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson. Estimating the support of a high-dimensional distribution. Technical Report 99-87, Microsoft Research, 1999.
V. Torczon and M. W. Trosset. From evolutionary operation to parallel direct search: pattern search algorithms for numerical optimization. Computing Science and Statistics, 29(1), 1997.
V. Vapnik. Statistical Learning Theory. Wiley, New York, NY, 1998.
V. Vapnik and O. Chapelle. Bounds on error expectation for SVM. In A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 261--280, Cambridge, MA, 2000. MIT Press.
C.-H. Yeang, S. Ramaswamy, P. Tamayo, S. Mukherjee, R. M. Rifkin, M. Angelo, M. Reich, et al. Molecular classification of multiple tumor types. Bioinformatics: Discovery Note, 1(1):1--7, 2001.
T. Joachims. The Maximum-Margin Approach to Learning Text Classifiers: Methods, Theory, and Algorithms. PhD thesis, Universitaet Dortmund, 2000.
S. S. Keerthi. Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms. Technical report, Department of Mechanical and Production Engineering, National University of Singapore, Singapore, 2001.
S. S. Keerthi, C. J. Ong, and M. M. Lee. Two efficient methods for computing leave-one-out error in SVM algorithms. Technical report, Department of Mechanical and Production Engineering, National University of Singapore, Singapore, 2000.
M. M. Lee, S. S. Keerthi, C. J. Ong, and D. DeCoste. An efficient method for computing leave-one-out error in support vector machines. Technical report, Department of Mechanical and Production Engineering, National University of Singapore, Singapore, 2001.
Y.-J. Lee and O. L. Mangasarian. RSVM: Reduced support vector machines. In Proceedings of the First SIAM International Conference on Data Mining, 2001.
C.-J. Lin. Formulations of support vector machines: a note from an optimization point of view. Neural Computation, 13(2):307--317, 2001.
A. Lunts and V. Brailovskiy. Evaluation of attributes obtained in statistical decision rules. Engineering Cybernetics, 3:98--109, 1967.
O. L. Mangasarian and D. R. Musicant. Successive overrelaxation for support vector machines. IEEE Trans. Neural Networks, 10(5):1032--1037, 1999.
D. Michie, D. J. Spiegelhalter, and C. C. Taylor. Machine Learning, Neural and Statistical Classification. Prentice Hall, Englewood Cliffs, NJ, 1994. Data available at anonymous ftp: ftp.ncc.up.pt/pub/statlog/.
E. Osuna, R. Freund, and F. Girosi. Training support vector machines: An application to face detection. In Proceedings of CVPR'97, 1997.
D. Pavlov, D. Chudova, and P. Smyth. Towards scalable support vector machines using squashing. In Proceedings of the ACM Conference on Knowledge Discovery and Data Mining, pages 295--299, 2000.