臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

詳目顯示 (Detailed Record)

Author: Chen, I-Ling (陳怡伶)
Title (Chinese): 結合多重辨識和最佳化核函數之方法於支撐向量機進行高維度資料分類
Title (English): Combining Ensemble Technique of Support Vector Machines with the Optimal Kernel Method for High Dimensional Data Classification
Advisor: Kuo, Bor-Chen (郭伯臣)
Committee members: Chang, Jyh-Yeong (張志永); Taur, Jin-Shiuh (陶金旭); Yang, Jinn-Min (楊晉民)
Oral defense date: 2011-07-07
Degree: Master's
Institution: National Taichung University of Education (國立臺中教育大學)
Department: Graduate Institute of Educational Measurement and Statistics (教育測驗統計研究所)
Discipline: Education
Field: Educational Testing and Assessment
Document type: Academic thesis
Year of publication: 2011
Graduation academic year: 99 (ROC calendar, 2010-2011)
Language: English
Number of pages: 67
Keywords (Chinese): 樣式辨識; 動態子空間法; 最佳化核函數; 支撐向量機
Keywords (English): pattern recognition; dynamic subspace method; optimal kernel; SVM
Usage counts:
  • Cited by: 0
  • Views: 291
  • Ratings: (none)
  • Downloads: 75
  • Bookmarked: 0
Abstract (translated from the Chinese):

In recent years, support vector machines (SVMs) and classifier fusion have been widely and successfully applied to improve recognition performance on high-dimensional data and to address the problems caused by the Hughes phenomenon. Many studies have confirmed that multiple classifier systems, such as the random subspace method and the dynamic subspace method, which build a set of diverse base classifiers by generating different feature subspaces, can ease the small-sample, high-dimensionality concern and achieve better results than a single classifier. Many studies have also shown that the SVM is a mature and effective classifier, and that it attains good accuracy when used as the base classifier in the two multiple classifier systems above. The main factor governing SVM classification performance, however, is the kernel function; selecting a suitable kernel function, or suitable parameters for it, is therefore crucial.
This study integrates the advantages of the above approaches to develop a multiple classifier system for high-dimensional data analysis built on SVM classifiers and the optimal kernel method, and proposes an SVM ensemble that incorporates the optimal kernel method to automatically select the subspace dimensionality and the feature subspaces. An automatic method for selecting the best parameter of the RBF kernel function identifies a kernelized space suited to classifying data in each dimensionality. The subspace selection step adopts the concept of the dynamic subspace method and introduces two importance density distribution functions, one to automatically select the subspace dimensionality and one to select the features of that subspace, with the aim of strengthening the recognition performance of the existing DSM. The experimental results show that the proposed approach selects suitable kernel functions better and, compared with DSM, achieves clear improvements in both reduced computation time and higher classification accuracy.
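To make the ensemble scheme described above concrete, the sketch below trains a subspace ensemble of RBF-kernel SVMs and fuses them by majority vote. It is a minimal illustration under stated assumptions, not the thesis's implementation: it uses scikit-learn's SVC (the thesis uses LIBSVM), samples feature subsets uniformly, and fixes the kernel parameter, whereas KDSM draws both the subspace size and its features from two importance distributions and optimizes the RBF parameter per subspace.

```python
import numpy as np
from sklearn.svm import SVC

def train_subspace_ensemble(X, y, n_classifiers=10, subspace_dim=5,
                            gamma=1.0, seed=0):
    """Train one RBF-kernel SVM per randomly drawn feature subspace."""
    rng = np.random.default_rng(seed)
    ensemble = []
    for _ in range(n_classifiers):
        # Uniform sampling here; KDSM instead draws the subspace size and
        # the features from two importance density distributions.
        feats = rng.choice(X.shape[1], size=subspace_dim, replace=False)
        clf = SVC(kernel="rbf", gamma=gamma).fit(X[:, feats], y)
        ensemble.append((feats, clf))
    return ensemble

def predict_majority(ensemble, X):
    """Fuse the base classifiers by majority vote (assumes integer labels)."""
    votes = np.stack([clf.predict(X[:, feats]) for feats, clf in ensemble])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

A typical call would be `ensemble = train_subspace_ensemble(X_train, y_train)` followed by `predict_majority(ensemble, X_test)`.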

Abstract (English):

In recent years, support vector machines (SVMs) and classifier combination have been widely and successfully used to improve classification of high-dimensional data and to counter the Hughes phenomenon. Many studies have demonstrated that multiple classifier systems, or so-called ensembles, can alleviate the concerns arising from small sample sizes and high dimensionality, and obtain more accurate and robust results than single models. Examples are the random subspace method (RSM) and the dynamic subspace method (DSM), both effective approaches for generating an ensemble of diverse base classifiers via different feature subsets. In addition, the SVM, widely regarded as a useful and effective classifier, can serve as the base classifier in the two methods above to achieve higher classification accuracy.
However, the performance of an SVM depends greatly on the choice of kernel function and of its parameters. The objectives of this research are therefore to develop an ensemble technique for SVMs based on the optimal kernel method, and to propose a novel subspace selection mechanism named the kernel-based dynamic subspace method (KDSM). KDSM combines the optimal kernel method with the strengths of DSM to improve SVM classification outcomes. The experimental results show that the proposed method performs better than conventional methods; moreover, compared with DSM, it achieves notable gains not only in classification accuracy but also in reduced computation time.
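For the kernel-selection step, the thesis adopts the optimal kernel method of Li et al. (2010) for the RBF parameter. As a generic stand-in for that step, the sketch below picks gamma by cross-validated grid search; the grid and fold count are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def select_rbf_gamma(X, y, gammas=None, folds=5):
    """Return the RBF parameter gamma maximizing cross-validated accuracy."""
    if gammas is None:
        gammas = np.logspace(-3, 3, 13)  # illustrative search grid
    search = GridSearchCV(SVC(kernel="rbf"), {"gamma": list(gammas)}, cv=folds)
    search.fit(X, y)
    return search.best_params_["gamma"]
```

In a KDSM-style pipeline, a routine like this would run once per candidate subspace before training the base SVM on it.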

CHAPTER 1: INTRODUCTION 1
1.1 Statement of Research 1
1.2 Organization of Thesis 4
1.3 Major Notation and Acronyms 6
CHAPTER 2: LITERATURE REVIEW 7
2.1 Ensemble Method 8
2.1.1 Random Subspace Method 11
2.1.2 Dynamic Subspace Method 15
2.2 Support Vector Machine 19
2.2.1 Kernel Method 20
2.2.2 SVM Algorithm 21
2.3 An Optimal Kernel Method for Selecting the RBF Kernel Parameter 24
CHAPTER 3: KERNEL-BASED DYNAMIC SUBSPACE METHOD 27
3.1 Importance Distribution of Band Membership 29
3.2 Importance Distribution of Dimensionality Weight 33
3.3 Optimal Kernel-Based Dynamic Subspace Ensemble 36
CHAPTER 4: EXPERIMENTAL DESIGN AND RESULTS 41
4.1 Experimental Design 41
4.1.1 Datasets of Experiment 43
4.2 Experimental Results 48
CHAPTER 5: CONCLUSION AND FUTURE WORK 55
APPENDIX A: THE TEST OF “SECTOR” UNIT 57
REFERENCES 61


Banfield, R. E., Hall, L. O., Bowyer, K. W., & Kegelmeyer, W. P. (2007). A Comparison of Decision Tree Ensemble Creation Techniques. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(1), 173-180.
Bauer, E., & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning, 36, 105-142.
Bellman, R. E. (1961). Adaptive Control Processes. Princeton, NJ: Princeton Univ. Press.
Bernardo, J. M., & Smith, A. F. M. (1994). Bayesian theory. New York, NY: Wiley.
Bertoni, A., Folgieri, R., & Valentini, G. (2004a). Feature Selection Combined with Random Subspace Ensemble for Gene Expression based Diagnosis of Malignancies. Paper presented at the meeting of the 15th Italian Workshop on Neural Nets, Perugia, Italy.
Bertoni, A., Folgieri, R., & Valentini, G. (2004b). Random subspace ensembles for the bio-molecular diagnosis of tumors, in Proc. NETTAB Workshop on Models and Metaphors from Biology to Bioinformatics Tools, 2004.
Boser, B. E., Guyon, I. M. & Vapnik, V. N. (1992). A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 144-152.
Breiman, L. (1996). Bagging predictors. Machine Learning, 24, 123-140.
Bruzzone, L. & Persello, C. (2009). A Novel Context-Sensitive Semisupervised SVM Classifier Robust to Mislabeled Training Samples, IEEE Trans. Geosci. Remote Sens., 47(7), 2142-2154.
Buntine, W. (1990). A theory of learning classification rules. Doctoral dissertation, School of Computing Science, University of Technology, Sydney, Australia.
Camps-Valls, G. & Bruzzone, L. (2005). Kernel-based methods for hyperspectral image classification, IEEE Trans. Geosci. Remote Sens., 43 (6), 1351–1362.
Camps-Valls, G. & Bruzzone, L. (2009). Kernel Methods for Remote Sensing Data Analysis. John Wiley & Sons, Ltd.
Camps-Valls, G., Gómez-Chova, L., Calpe, J., Soria, E., Martín, J. D., Alonso, L. & Moreno, J. (2004). Robust support vector method for hyperspectral data classification and knowledge discovery, IEEE Trans. Geosci. Remote Sens., 42 (7), 1530–1542.
Camps-Valls, G., Gomez-Chova, L., Munoz-Mari, J., Rojo-Alvarez, J. L. & Martinez-Ramon, M. (2008). Kernel-based framework for multitemporal and multisource remote sensing data classification and change detection, IEEE Trans. Geosci. Remote Sens., 46 (6), 1822–1835.
Chang, C.C. & Lin, C.J. (2001). LIBSVM: A Library for Support Vector Machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
Chen, B., Liu, H. & Bao, Z. (2008). Optimizing the Data-dependent Kernel under A Unified Kernel Optimization Framework, Pattern Recognition, 41(6), 2107-2119.
Chen, I.-L., Li, C.-H., Kuo, B.-C. & Huang, H.-Y. (2010). Applying Optimal Algorithm to Data-dependent Kernel for Hyperspectral Image Classification, Proceedings of International Geosciences and Remote Sensing Symposium, 2808-2811
Christopher, J. W. (2004). Hyperspectral image classification with limited training data samples using feature subspaces, Proc. SPIE, 5425, 170–181.
Chuang, C.-H., Kuo, B.-C. & Wang, H.-P. (2008). Fuzzy fusion method for combining small number of classifiers in hyperspectral image classification, Proc. 8th Int. Conf. Intell. Syst. Des. Appl., 1, 26–28.
Cortes, C., & Vapnik, V. (1995). Support vector networks. Machine Learning, 20, 273–297.
Dietterich, T. G. (2000). Ensemble Methods in Machine Learning. Lecture Notes in Computer Science, 1857, 1-15.
Drucker, H., Cortes, C., Jackel, L. D., LeCun, Y., & Vapnik, V. (1994). Boosting and other machine learning algorithms. Proceedings of the Eleventh International Conference on Machine Learning, 53-61.
Freund, Y., & Schapire, R. E. (1996). Experiments with a new boosting algorithm. Proceedings of the Thirteenth International Conference on Machine Learning, 148-156.
Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition. (2nd ed.). San Diego, CA: Academic.
Ham, J., Chen, Y., Crawford, M. M. & Ghosh, J. (2005). Investigation of the random forest framework for classification of hyperspectral data, IEEE Trans. Geosci. Remote Sens., 43(3), 492–501.
Ho, T. K. (1998a). The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8), 832-844.
Ho, T. K. (1998b). Nearest Neighbors in Random Subspaces. Proceedings of 2nd International Workshop Statistical Techniques in Pattern Recognition, 640-648.
Hughes, G. F. (1968). On the Mean Accuracy of Statistical Pattern Recognizers. IEEE Transactions on Information Theory, IT-14(1), 55-63.
Shawe-Taylor, J., & Cristianini, N. (2004). Kernel Methods for Pattern Analysis. Cambridge, UK: Cambridge University Press.
Kong, E. B., & Dietterich, T. G. (1995). Error-correcting output coding corrects bias and variance. Proceedings of the Twelfth International Conference on Machine Learning, 313-321.
Kuncheva, L. I. (2004). Combining Pattern Classifiers: Methods and Algorithms. Hoboken, NJ: Wiley & Sons.
Kuo, B.-C., Hsieh, Y.-C., Liu, H.-C., & Chao, R.-M. (2005). A Random Subspace Method with Automatic Dimensionality Selection for Hyperspectral Image Classification. Proceedings of International Geoscience and Remote Sensing Symposium, 25-29.
Kuo, B.-C., Pai, C.-H., Sheu, T.-W., & Chen, G.-S. (2004). Hyperspectral Data Classification using Classifier Overproduction and Fusion Strategies, Proceedings of IEEE International Geoscience and Remote Sensing Symposium, 5, 2937-2940.
Landgrebe, D. A. (2003). Signal Theory Methods in Multispectral Remote Sensing. Hoboken, NJ: John Wiley and Sons.
Li, C.-H., Lin, C.-T., Kuo, B.-C. & Chu, H.-S. (2010). An automatic method for selecting the parameter of the RBF kernel function to support vector machines, Proceedings of International Geosciences and Remote Sensing Symposium, 836-839.
Maclin, R., & Opitz, D. (1997). An empirical evaluation of bagging and boosting. Proceedings of the Fourteenth National Conference on Artificial Intelligence. Providence, RI: AAAI Press.
Melgani, F. & Bruzzone, L. (2004). Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens.,42 (8), 1778–1790.
Neal, R. M. (1993). Probabilistic inference using Markov chain Monte Carlo methods (Technical Report CRG-TR-93-1). Department of Computer Science, University of Toronto, Toronto, Canada.
Parzen, E. (1962). On Estimation of a Probability Density Function and Mode. Annals of Mathematical Statistics, 33(3), 1065-1076.
Quinlan, J. R. (1996). Bagging, boosting, and C4.5. Proceedings of the Thirteenth National Conference on Artificial Intelligence, 725-730.
Raudys, S. J. & Jain, A. K. (1991). Small Sample Size Effects in Statistical Pattern Recognition: Recommendations for Practitioners. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(3), 252-264.
Silverman, B. W. (1985). Density Estimation for Statistics and Data Analysis, London, UK: Chapman & Hall.
Skurichina, M. & Duin, R. P. W. (2001). Bagging and the Random Subspace Method for Redundant Feature Spaces. Proceeding of 2nd International Workshop Multiple Classifier Systems, 1-10.
Skurichina, M. & Duin, R. P. W. (2002). Bagging, Boosting and the Random Subspace Method for Linear Classifiers. Pattern Analysis and Applications, 5(2), 121-135.
Sun, S., Zhang, C., & Zhang, D. (2007). An Experimental Evaluation of Ensemble Methods for EEG Signal Classification. Pattern Recognition Letters, 28(15), 2157-2163.
Tao, D., Tang, X., Li, X., & Wu, X. (2006). Asymmetric Bagging and Random Subspace for Support Vector Machines-based Relevance Feedback in Image Retrieval. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(7), 1088-1099.
Tarabalka, Y., Benediktsson, J. A. & Chanussot, J. (2009). Spectral–Spatial Classification of Hyperspectral Imagery Based on Partitional Clustering Techniques. IEEE Trans. Geosci. Remote Sens., 47(8), 2973-2987.
Xiong, H., Swamy, M. N. S., & Ahmad, M. O. (2005). Optimizing the kernel in the empirical feature space. IEEE Transactions on Neural Networks, 16(2), 460–474.
Yang, J.-M., Kuo, B.-C., Yu, P.-T., & Chuang, C.-H. (2010). A Dynamic Subspace Method for Hyperspectral Image Classification. IEEE Transactions on Geoscience and Remote Sensing, 48(7), 2840-2853.
