臺灣博碩士論文加值系統 (Taiwan National Digital Library of Theses and Dissertations)

Detailed Record

Author: 梁杰榮
Author (English): Kit-weng Leong
Title (Chinese): 分類問題之研究-以複數型模糊類神經系統為方法
Title (English): A Study on Classification Problem using Complex Neuro-Fuzzy Approach
Advisor: 李俊賢
Advisor (English): ChunShien Li
Degree: Master's
Institution: 國立中央大學 (National Central University)
Department: 資訊管理學系 (Department of Information Management)
Discipline: Computer Science
Field of study: General Computer Science
Thesis type: Academic thesis
Year of publication: 2015
Graduation academic year: 103 (ROC calendar)
Language: Chinese
Pages: 92
Keywords (Chinese): 特徵選取、分類、資訊理論、複數型模糊集合
Keywords (English): feature selection; classification; information theory; complex fuzzy
Usage statistics:
  • Cited by: 0
  • Views: 342
  • Downloads: 0
  • Bookmarked: 0
Abstract (Chinese, translated):
This study proposes a complex neuro-fuzzy system (CNFS) together with a feature selection method based on information theory, applied to classification problems. The feature selection combines the concepts of minimal redundancy and maximal relevance to search for an optimal feature subset. Modeling of the CNFS classifier proceeds in two stages: a structure learning phase and a parameter learning phase. The structure learning phase uses a grid partitioning method to select the important fuzzy rules for the CNFS classifier. The parameter learning phase uses particle swarm optimization (PSO) and the recursive least squares estimator (RLSE) to adjust the model's premise parameters and consequent parameters, respectively; this hybrid scheme, called the PSO-RLSE learning algorithm, lets the model converge rapidly during training. The proposed CNFS classifier combines complex fuzzy sets (CFSs) with the architecture of the adaptive neuro-fuzzy inference system (ANFIS), increasing the model's nonlinear mapping ability and providing a more flexible structure. Ten data sets from different fields in the machine learning repository of the University of California, Irvine are used to validate the proposed method, which is compared against other classifiers. Experimental results show that the proposed method performs well on classification problems across these domains.
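For reference, the minimal-redundancy-maximal-relevance (mRMR) criterion summarized above follows Peng, Long, and Ding (2005), and the complex fuzzy set membership follows Ramot et al. (2002). With mutual information I(·;·), a candidate feature subset S, and class label c:

```latex
% Mutual information between a (discrete) feature x and the class c
I(x;c) = \sum_{x}\sum_{c} p(x,c)\,\log\frac{p(x,c)}{p(x)\,p(c)}

% Maximal relevance of subset S to the class, and minimal redundancy within S
D(S,c) = \frac{1}{|S|}\sum_{x_i \in S} I(x_i;c), \qquad
R(S)   = \frac{1}{|S|^{2}}\sum_{x_i,\,x_j \in S} I(x_i;x_j)

% mRMR selects the subset maximizing relevance minus redundancy
\max_{S}\;\bigl[\, D(S,c) - R(S) \,\bigr]

% A complex fuzzy set A has a unit-disc-valued membership function
\mu_A(x) = r_A(x)\, e^{\,j\,\omega_A(x)}, \qquad r_A(x) \in [0,1]
```

In practice the subset is built incrementally: given the m-1 features already selected, the m-th feature is the one that maximizes its relevance to the class minus its mean redundancy with the already-selected features.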
Abstract (English, as given):
We present a complex neuro-fuzzy system (CNFS) as a pattern classifier that utilizes complex fuzzy sets. For feature selection on the training samples, we remove redundant and irrelevant features in order to improve the predictive accuracy of the classifier. Based on information theory, we employ a well-known feature selection method that combines minimal redundancy and maximal relevance. A crucial problem in constructing fuzzy-rule-based models is that the number of rules, and with it the consequent-part parameters of the rule base, grows exponentially with the input dimensionality. To deal with this, we employ a modified grid partitioning method that retains a partitioned region of the input space only if a rule-firing-strength threshold is satisfied. For parameter learning, the particle swarm optimization (PSO) algorithm and the recursive least squares estimator (RLSE) are integrated as a hybrid learning method that adjusts the free parameters of the CNFS effectively. We conducted experiments on 10 data sets from various fields and compared performance with other classifiers. The experimental results demonstrate that our approach finds smaller feature subsets while maintaining high classification accuracy.
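To make the hybrid learning idea concrete, here is a rough, self-contained sketch of the PSO-RLSE divide-and-conquer scheme on a toy Gaussian-basis model rather than the full complex-valued CNFS; every function name, constant, and the model form below are illustrative assumptions, not the thesis's code. PSO searches the nonlinear premise parameters (centers and spreads), while RLSE fits the linear consequent parameters in a single recursive pass per evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)

def basis(X, centers, spreads):
    """Gaussian firing strengths for each rule; shape (n_samples, n_rules)."""
    d = X[:, None, :] - centers[None, :, :]
    return np.exp(-np.sum((d / spreads[None, :, :]) ** 2, axis=2))

def rlse_fit(B, y, alpha=1e6):
    """Recursive least squares over the rows of B; returns consequent params."""
    theta = np.zeros(B.shape[1])
    P = alpha * np.eye(B.shape[1])            # large initial covariance
    for b, t in zip(B, y):
        Pb = P @ b
        k = Pb / (1.0 + b @ Pb)               # RLSE gain vector
        theta = theta + k * (t - b @ theta)   # correct with a-priori error
        P = P - np.outer(k, Pb)
    return theta

def fitness(particle, X, y, n_rules):
    """Decode premise params, fit consequents with RLSE, return RMSE."""
    p = particle.reshape(2, n_rules, X.shape[1])
    B = basis(X, p[0], np.abs(p[1]) + 1e-6)   # keep spreads positive
    theta = rlse_fit(B, y)
    return np.sqrt(np.mean((B @ theta - y) ** 2))

def pso_rlse(X, y, n_rules=4, n_particles=20, iters=50):
    """PSO over premise parameters; each evaluation embeds an RLSE fit."""
    dim = 2 * n_rules * X.shape[1]            # centers + spreads per particle
    pos = rng.normal(0.0, 1.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pcost = np.array([fitness(p, X, y, n_rules) for p in pos])
    gbest = pbest[np.argmin(pcost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        cost = np.array([fitness(p, X, y, n_rules) for p in pos])
        improved = cost < pcost
        pbest[improved], pcost[improved] = pos[improved], cost[improved]
        gbest = pbest[np.argmin(pcost)].copy()
    return gbest, pcost.min()
```

Calling pso_rlse(X_train, y_train) would return tuned premise parameters and the training RMSE. In the thesis's setting the same split applies to the CNFS: PSO tunes the premise (complex membership function) parameters while RLSE solves the consequent parameters, which is what lets the model converge quickly during training.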
Table of Contents:
Chinese Abstract I
English Abstract II
Acknowledgements III
Table of Contents IV
List of Figures VI
List of Tables VIII
List of Symbols X
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 1
1.3 Overview of Research Methods 3
1.4 Thesis Organization 4
Chapter 2 Literature Review 5
2.1 Feature Selection 5
2.1.1 Entropy 5
2.1.2 Conditional Entropy 5
2.1.3 Mutual Information 6
2.2 Feature Selection Based on Minimal Redundancy and Maximal Relevance 7
2.2.1 Maximal Relevance 7
2.2.2 Minimal Redundancy 7
2.2.3 The mRMR Feature Selection Method 8
2.3 Classifier Model 9
2.3.1 Fuzzy Sets 9
2.3.2 Origins of Fuzzy Sets 9
2.3.3 Definition of Fuzzy Sets 10
2.3.4 Complex Fuzzy Sets 11
2.3.5 Complex Neuro-Fuzzy Inference System 13
Chapter 3 System Learning Strategy 17
3.1 Structure Learning Phase 17
3.2 Parameter Learning Phase 19
3.2.1 Particle Swarm Optimization 19
3.2.2 Recursive Least Squares Estimator 21
3.2.3 The PSO-RLSE Hybrid Learning Algorithm 22
Chapter 4 Experiments 25
4.1 UCI Data Sets 25
4.2 Experimental Environment and Initial Parameter Settings 34
4.3 Evaluating the CNFS Classifier with Different Numbers of Selected Features 35
4.4 Comparison of the CNFS Classifier with Other Classifiers 42
Chapter 5 Discussion and Conclusions 45
5.1 Discussion of the mRMR Feature Selection Method 45
5.2 Complex Fuzzy Sets Applied to Classification Problems 45
5.3 The Complex Neuro-Fuzzy System Applied to Classification Problems 46
5.4 Application of the PSO-RLSE Hybrid Learning Algorithm 46
5.5 Conclusions 46
Chapter 6 Future Research Directions 48
6.1 Improving the Feature Selection Method 48
6.2 Improving the Grid Partitioning Method 48
6.3 Improving the PSO-RLSE Hybrid Learning Algorithm 48
References 50
Appendix 1 58
Appendix 2 71
