
National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)


Detailed Record

Author: 王俊凱
Author (English): Jiun-Kai Wang
Thesis Title: 最小群聚結構鑑別之自我調節群聚演算法及其應用
Thesis Title (English): A Self-Regulating Clustering Algorithm for Identification of Minimal Cluster Configuration and Its Applications
Advisor: 王振興
Advisor (English): Jeen-Shing Wang
Degree: Master's
Institution: 國立成功大學 (National Cheng Kung University)
Department: 電機工程學系碩博士班 (Department of Electrical Engineering, Master's and Doctoral Program)
Discipline: Engineering
Field: Electrical and Computer Engineering
Document Type: Academic thesis
Year of Publication: 2004
Graduation Academic Year: 92 (ROC calendar, 2003-2004)
Language: English
Number of Pages: 54
Keywords (Chinese): 函數近似、徑向基類神經網路、群聚演算法
Keywords (English): function approximation; radial basis function networks; clustering algorithm
Usage statistics:
  • Cited by: 1
  • Views: 149
  • Rating:
  • Downloads: 24
  • Bookmarked: 1
  Clustering analysis is regarded as the task of discovering the internal structure of a large amount of unlabeled data. The greatest difficulty in performing clustering analysis lies in determining the number of groups within the data. To address this problem, a great deal of research has been devoted to equipping clustering algorithms with systematic mechanisms that automatically discover the cluster structure of the data. From a survey of the clustering literature, these effective mechanisms can be grouped into three categories: growing, merging, and splitting. During clustering analysis, each mechanism has its own significance and plays a different role in revealing information about the cluster structure. This thesis proposes a self-regulating clustering algorithm that can identify the cluster structure of the data without any prior knowledge. The growing, merging, and splitting mechanisms, together with online operation capability, are integrated into the self-regulating clustering algorithm, and they help the algorithm identify a suitable cluster structure for the data. In addition, this thesis proposes a new idea, called cluster boundary estimation, which represents the final clustering result effectively with compact, high-dimensional elliptic boundaries. The boundary estimation method uses a virtual cluster spread and a regulating vector so that the self-regulating clustering algorithm can find a compact cluster structure close to the actual one. Finally, the performance of the self-regulating clustering algorithm is verified with two applications: widely cited benchmark examples and function approximation realized with radial basis function networks.
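  For orientation only, the hyper-elliptic boundary mentioned above can be pictured in its standard form. Writing c_j for the center of cluster j and sigma_{j1}, ..., sigma_{jn} for its per-dimension spreads, a generic hyper-elliptic boundary in an n-dimensional space is the set of points x satisfying the inequality below; this is simply the textbook form of such a boundary, not necessarily the exact formulation developed in the thesis, whose virtual cluster spread and regulating vector are not reproduced here:

  \sum_{d=1}^{n} \frac{(x_d - c_{jd})^2}{\sigma_{jd}^2} \le 1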
  Clustering analysis can be regarded as the task of finding structure in a collection of unlabeled data. One substantial problem in performing a cluster analysis is determining the number of clusters in the data set. To solve this problem, a great deal of research effort has been directed toward equipping clustering algorithms with systematic frameworks that automatically reveal the cluster configuration of a data set. From a review of the clustering literature, these frameworks can be classified into growing, merging, and splitting mechanisms. Each mechanism has its own significance and plays an important role in revealing information about the cluster configuration during clustering analysis. In this study, a self-regulating clustering algorithm (SRCA) is developed to provide an alternative solution for identifying the cluster configuration without a priori knowledge of the given data set. Growing, merging, and splitting mechanisms with online operation capability are integrated into the proposed SRCA, enabling it to identify a suitable cluster configuration. In addition, a novel idea for cluster boundary estimation is proposed to keep the resultant clusters within compact hyper-elliptic boundaries. A virtual cluster spread, coupled with a regulating vector, enables the proposed SRCA to reveal a compact cluster configuration that may be close to the real one, if one exists. Finally, the effectiveness of the proposed SRCA was validated in two applications: benchmark examples and function approximation problems using radial basis function networks.
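  The following is a minimal Python sketch of the kind of online clustering loop the abstract describes: a new cluster grows when a sample falls outside every existing hyper-elliptic boundary, the nearest cluster is otherwise updated online, and overlapping clusters are merged. The class name, the distance test, the init_spread and grow_threshold parameters, and the merge rule are all illustrative assumptions; the SRCA's actual splitting mechanism, virtual cluster spread, and regulating vector are not reproduced here.

import numpy as np

class OnlineClusterer:
    """Illustrative online clustering with grow and merge steps.

    Each cluster keeps a center, a per-dimension spread (a diagonal
    hyper-ellipse), and a sample count.  The thresholds and update
    rules below are assumptions for illustration only.
    """

    def __init__(self, init_spread=0.5, grow_threshold=2.0):
        self.centers, self.spreads, self.counts = [], [], []
        self.init_spread = init_spread
        self.grow_threshold = grow_threshold

    def _distance(self, x, k):
        # Spread-normalized distance of sample x to cluster k.
        d = (x - self.centers[k]) / self.spreads[k]
        return float(np.sqrt(np.mean(d ** 2)))

    def partial_fit(self, x):
        x = np.asarray(x, dtype=float)
        if not self.centers:                      # first sample: create a cluster
            self._grow(x)
            return
        dists = [self._distance(x, k) for k in range(len(self.centers))]
        k = int(np.argmin(dists))
        if dists[k] > self.grow_threshold:        # outside all boundaries: grow
            self._grow(x)
        else:                                     # inside a boundary: online update
            self.counts[k] += 1
            lr = 1.0 / self.counts[k]
            self.centers[k] += lr * (x - self.centers[k])
            self.spreads[k] += lr * (np.abs(x - self.centers[k]) - self.spreads[k])
            self.spreads[k] = np.maximum(self.spreads[k], 1e-3)
        self._merge()

    def _grow(self, x):
        self.centers.append(x.copy())
        self.spreads.append(np.full_like(x, self.init_spread))
        self.counts.append(1)

    def _merge(self):
        # Merge any pair of clusters whose centers lie within the sum of spreads.
        i = 0
        while i < len(self.centers):
            j = i + 1
            while j < len(self.centers):
                gap = np.abs(self.centers[i] - self.centers[j])
                if np.all(gap < self.spreads[i] + self.spreads[j]):
                    n = self.counts[i] + self.counts[j]
                    self.centers[i] = (self.counts[i] * self.centers[i]
                                       + self.counts[j] * self.centers[j]) / n
                    self.spreads[i] = np.maximum(self.spreads[i], self.spreads[j])
                    self.counts[i] = n
                    del self.centers[j], self.spreads[j], self.counts[j]
                else:
                    j += 1
            i += 1

if __name__ == "__main__":
    # Three well-separated 2-D Gaussian blobs; print how many clusters
    # remain after one pass over the shuffled data.
    rng = np.random.default_rng(0)
    blobs = [rng.normal(c, 0.3, size=(100, 2)) for c in ((0, 0), (3, 3), (0, 4))]
    model = OnlineClusterer()
    for sample in rng.permutation(np.vstack(blobs)):
        model.partial_fit(sample)
    print("clusters found:", len(model.centers))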
CHINESE ABSTRACT i
ABSTRACT ii
LIST OF TABLES v
LIST OF FIGURES vi

1 Introduction 1-1
1.1 Motivation 1-1
1.2 Literature Survey 1-2
1.3 Purpose of the Study 1-3
1.4 Organization of the Thesis 1-4

2 Clustering Algorithms 2-1
2.1 Hierarchical Clustering Algorithms 2-1
2.2 Partitioning Clustering Algorithms 2-4
2.2.1 Static Adaptation Methods 2-5
2.2.2 Dynamic Adaptation Methods 2-6
2.2.2.1 Growing Partitioning Algorithms 2-7
2.2.2.2 Shrinking Partitioning Algorithms 2-8
2.3 Self-Regulating Clustering Algorithm (SRCA) 2-12
2.3.1 Cluster Boundary Estimation 2-12
2.3.2 Three Major Mechanisms—Growing, Merging and Splitting Mechanisms 2-16
2.3.2.1 Growing Mechanism 2-16
2.3.2.2 Merging Mechanism 2-17
2.3.2.3 Splitting Mechanism 2-27

3 Radial Basis Function Networks and Parameter Learning Algorithms 3-1
3.1 Radial Basis Function Networks 3-1
3.1.1 Conventional Radial Basis Function Network 3-2
3.1.2 Normalized Radial Basis Function Network 3-4
3.2 Parameter Learning Algorithms 3-5

4 Simulations 4-1
4.1 Benchmark Examples 4-1
4.1.1 Artificial Data 4-1
4.1.2 Iris Data 4-2
4.2 Function Approximation Using RBFNs 4-5
4.2.1 Example 1 4-5
4.2.2 Example 2 4-6
5 Conclusions and Future Works 5-1
References

LIST OF TABLES
Table Page
4.1 The information of the generated artificial data. 4-4
4.2 The estimated cluster distribution by the SRCA and the estimation errors. 4-4
4.3 The estimated Iris cluster distribution by the SRCA. 4-4
4.4 The errors between the actual and estimated cluster distribution. 4-5
4.5 The NRMSEs of function approximation obtained by three different clustering algorithms for Example 1. 4-7
4.6 The NRMSEs of function approximation obtained by three different clustering algorithms for Example 2. 4-7

LIST OF FIGURES
Figure Page
2.1 Hierarchical clustering algorithms. (a) Single linkage. (b) Complete linkage. 2-2
2.2 Flowchart of growing partitioning algorithm. 2-7
2.3 The flowchart of SRCA. 2-11
2.4 The dynamic trajectory of the regulating vector R. 2-14
2.5 Mapping consistency of the MCA. 2-20
2.6 Three clusters merge into two clusters with different measures. 2-20
2.7 Three different situations for the index . (a) Fix variations of and C; only consider the relation between and . (b) Fix variations of and; only consider the relation between and C. (c) Fix variations of and C; only consider the relation between and . 2-23
2.8 The surface of the index i,j with different C and of cluster j. 2-24
2.9 Cluster split-up: one cluster splits into two clusters. 2-28
3.1 One dimensional radial basis function. (a) The representation of the one dimensional RBF and its equation. (b) One dimensional Gaussian function. 3-2
3.2 Two dimensional radial basis function. (a) The representation of two dimensional RBF and its equation. (b) Two dimensional Gaussian function. 3-3
3.3 The architecture of the conventional radial basis function network. 3-4
3.4 The architecture of the normalized radial basis function network. 3-5
4.1 The results of artificial clusters. (a) The distribution of the input space. (b) The distribution of the output space. 4-3
4.2 The estimated Iris clusters by the proposed SRCA. 4-3
4.3 Example 1 of function approximation. (a) The training samples and the training result. (b) The approximation result by the SRCA. 4-6
4.4 Example 2 of function approximation. (a) The original function surface. (b) The approximated result. 4-7