Author: 林忠億 (Jung Yi Lin)
Title: 多層式多重族群基因規劃法與其應用
Title (English): Layered Multi-Population Genetic Programming and Its Applications
Advisors: 楊維邦 (Wei-Pang Yang), 錢炳全 (Been-Chian Chien)
Degree: Ph.D.
Institution: 國立交通大學 (National Chiao Tung University)
Department: 資訊科學與工程研究所 (Institute of Computer Science and Engineering)
Discipline: Engineering
Field: Electrical Engineering and Computer Science
Year of Publication: 2007
Academic Year of Graduation: 95 (ROC calendar, 2006–2007)
Language: English
Pages: 90
Keywords: genetic programming; multi-population genetic programming; classification; classifier design; feature selection; feature construction; evolutionary computation
Abstract in Chinese (translated):
Genetic programming is a machine learning method belonging to evolutionary computation. It simulates the evolutionary mechanisms of the natural world, applying the principle of "survival of the fittest" to search for optimal solutions that satisfy given conditions. Improving the efficiency of genetic programming has long been an active research direction.

Classification is an important problem in knowledge engineering. Most classification problems cannot be solved by human knowledge alone; how to discover a basis for classification from data is therefore the motivation behind many machine learning methods. Feature selection and feature construction are two research areas concerned with processing features. With appropriate feature processing, both the efficiency and the accuracy of classification can be improved.

This dissertation combines three research directions (genetic programming, feature selection, and feature construction) and proposes a layered multi-population genetic programming architecture for solving classification problems. Traditional genetic programming uses a single population, from which the evolutionary simulation produces the best solution. We extend single-population genetic programming to multi-population genetic programming and propose a layered architecture to integrate the populations. Each layer evolves several populations, whose results are integrated and evolved further in the next layer. This layered architecture not only exploits multiple populations to obtain better solutions, but also uses the layering to refine and adjust those solutions. Through experiments, we show that the method is both highly accurate and efficient. In addition, to improve the learning performance of each population, we propose a dynamic mutation adjustment method based on the average fitness and the number of remaining generations. To handle multi-class classification, we propose a mechanism based on statistical theory that makes genetic programming applicable to multi-class problems and further improves classification accuracy. Within this architecture, we also propose a method that integrates feature selection and feature construction, and verify its classification accuracy and feature-processing effectiveness experimentally.
Abstract in English:
This study focuses on a proposed method based on genetic programming (GP). Genetic programming is a prominent technique of evolutionary computation (EC): it mimics the evolutionary mechanisms of biological systems to determine optimal solutions for given training instances. Much research has been devoted to enhancing the effectiveness and efficiency of genetic programming.
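The evolutionary loop that genetic programming follows can be sketched as below. This is a minimal illustration only: the primitive set, the error-based fitness function, the truncation selection, and the subtree mutation are placeholder choices for the sketch, not the operators defined in this thesis.

```python
import random

# Placeholder primitive set: arithmetic functions plus two terminals.
FUNCS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}
TERMINALS = ['x', 1.0]

def random_tree(depth=3):
    # Grow a random expression tree over the primitive set.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return [random.choice(list(FUNCS)), random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return FUNCS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, cases):
    # Lower is better: total absolute error on the training instances.
    return sum(abs(evaluate(tree, x) - y) for x, y in cases)

def mutate(tree):
    # Replace a random subtree with a freshly grown one.
    if not isinstance(tree, list) or random.random() < 0.3:
        return random_tree(2)
    child = random.choice([1, 2])
    new = list(tree)
    new[child] = mutate(tree[child])
    return new

def evolve(cases, pop_size=60, generations=30):
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda t: fitness(t, cases))
        survivors = population[:pop_size // 2]        # keep the fitter half
        offspring = [mutate(random.choice(survivors))
                     for _ in range(pop_size - len(survivors))]
        population = survivors + offspring
    return min(population, key=lambda t: fitness(t, cases))

random.seed(0)
cases = [(x, x * x + x) for x in range(-5, 6)]       # target: f(x) = x^2 + x
best = evolve(cases)
```

The same loop structure (initialize, evaluate, select, vary, repeat) underlies the population-level evolution described throughout the thesis; only the representation and operators differ.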
The applications of the proposed method include classification and feature processing.
Classification problems play an important role in the development of knowledge
engineering. The hidden relations that could serve as a basis for classification are often
unclear and not easily elucidated, so many machine learning algorithms have arisen
to solve such problems. Feature selection and feature generation are two important
techniques for dealing with features: feature selection removes useless, irrelevant,
redundant, and noisy features, while feature generation creates new features that can
improve classification accuracy.
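To make the distinction concrete, the following sketch shows one simple filter-style feature selector: it drops (nearly) constant features as useless and drops features that are almost collinear with an already-kept feature as redundant. This is a generic illustration and deliberately not the selection method proposed in this thesis; the thresholds are arbitrary.

```python
import statistics

def pearson(a, b):
    # Pearson correlation between two equal-length sequences.
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0

def select_features(X, var_threshold=1e-9, corr_threshold=0.95):
    """Return indices of features that are neither constant nor
    near-duplicates of an already-kept feature."""
    n_features = len(X[0])
    cols = [[row[j] for row in X] for j in range(n_features)]
    kept = []
    for j, col in enumerate(cols):
        if statistics.pvariance(col) <= var_threshold:
            continue  # useless: the feature is (nearly) constant
        if any(abs(pearson(col, cols[k])) >= corr_threshold for k in kept):
            continue  # redundant: duplicates a feature we already keep
        kept.append(j)
    return kept

X = [[1.0, 2.0, 2.0, 5.0],
     [1.0, 3.0, 3.0, 1.0],
     [1.0, 4.0, 4.0, 2.0]]
kept = select_features(X)   # feature 0 is constant, feature 2 duplicates feature 1
```

Feature generation works in the opposite direction: instead of discarding columns, it builds new columns (for example, functions of existing features), which is exactly the role the evolved discriminant functions play later in the thesis.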
In this study we propose a layered multi-population genetic programming method
to solve classification problems, together with a variant that performs feature selection
and feature construction simultaneously. The layered multi-population genetic
programming method employs a layered architecture to arrange multiple populations:
a layer is composed of a number of populations, and each population evolves to
generate a discriminant function. The set of discriminant functions generated by one
layer is integrated and transformed by the successive layer. To improve the learning
performance, an adaptive mutation probability tuning method is proposed. Moreover,
a statistics-based method is proposed to solve multi-category classification problems.
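The data flow of the layered architecture can be sketched as follows. The layering matches the description above (each layer evolves several populations, and the discriminant values its functions produce become the feature vectors fed to the next layer), but the inner GP evolution is replaced here by a random linear scorer so that the sketch is self-contained and runnable; the `layout` shape is an assumed example, not a configuration from the thesis.

```python
import random

def evolve_discriminant(data, labels):
    """Placeholder for evolving one population into a discriminant
    function; a real implementation would run a full GP loop here.
    We return a random linear scorer just to make the layering run."""
    weights = [random.uniform(-1.0, 1.0) for _ in data[0]]
    return lambda row: sum(w * v for w, v in zip(weights, row))

def layered_gp(X, y, layout=(3, 2, 1)):
    """Layered multi-population scheme: layer i evolves layout[i]
    populations, one discriminant function each; every instance is then
    re-expressed as its vector of discriminant values and handed to the
    next layer, which integrates and transforms the previous solutions."""
    data = X
    layers = []
    for n_populations in layout:
        funcs = [evolve_discriminant(data, y) for _ in range(n_populations)]
        # Transform: each instance becomes its vector of discriminant values.
        data = [[f(row) for f in funcs] for row in data]
        layers.append(funcs)
    return layers, data

random.seed(1)
X = [[0.2, 1.5, 3.0], [0.9, 0.4, 2.2], [1.1, 2.0, 0.5], [0.3, 0.8, 1.9]]
y = [0, 0, 1, 1]
layers, final_scores = layered_gp(X, y)
```

Because each layer consumes only the previous layer's discriminant values, later layers refine earlier solutions rather than restarting from the raw features, which is the advantage the abstract attributes to the layered design.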
Several experiments on classical classification problems and real-world medical
problems are conducted using different configurations. Experimental results show that
the proposed methods are accurate and effective.
Table of Contents

Abstract in Chinese
Abstract in English
Acknowledgements
1 Introduction
1.1 Genetic Programming
1.2 Classification
1.3 Feature Selection and Feature Construction
1.4 Motivation and Contribution
1.5 Overview of Chapters
2 Literature Review
2.1 Genetic Programming
2.1.1 Single Population Genetic Programming
2.1.1.1 The population and individuals
2.1.1.2 The fitness function
2.1.1.3 Generations and selection methods
2.1.2 Multi-Population Genetic Programming
2.2 Classification Algorithms
2.2.1 Classifiers
2.2.2 A Brief Review of Classifiers
2.3 Genetic Programming and Classification
2.4 Feature Selection and Feature Construction
2.5 Genetic Programming, Feature Selection and Feature Construction
2.6 Research Questions
3 Research Design
3.1 Introduction
3.2 LAGEP: Evolving a Population
3.2.1 Individual Definitions
3.2.2 Fitness Function
3.2.3 Validation
3.2.4 Elitism Evolution Strategy and the Evolutionary Flowchart
3.2.5 AMPT: Adaptive Mutation Probability Tuning
3.3 LAGEP: Evolving Layers
3.3.1 Layered Architecture
3.3.2 Advantages of Using LAGEP
3.3.3 The Testing Phase and the Z-value Measure Method, ZM
3.3.4 LAGEP: An Example
3.4 LAGEP-FS: LAGEP with Feature Construction and Feature Selection
3.4.1 Architecture
3.4.2 Proposed Feature Selection Methods
3.4.3 An Example
4 Experiment Study
4.1 Introduction
4.2 Medical Classification Problems
4.3 Hypotheses
4.4 Experiments of the AMPT Method
4.5 LAGEP Experiments
4.5.1 Experimental Settings
4.5.2 Analysis and Discussion
4.5.2.1 Comparing classification accuracy between SGP, LAGEP, and four cited methods
4.5.2.2 Comparing classification accuracy between LAGEP and ES1
4.5.2.3 Comparing elapsed training time
4.5.2.4 Comparing classification accuracy between LAGEP settings
4.5.2.5 The improvement of score values
4.6 LAGEP-FS Experiments
4.6.1 Experimental Settings
4.6.2 Analysis and Discussion
4.6.2.1 Analyzing ES1
4.6.2.2 Analyzing two-layer LAGEP-FS settings
4.6.2.3 Analyzing three-layer LAGEP-FS settings
4.6.2.4 Analyzing LAGEP-FS settings that have the same number of populations
4.6.2.5 Analyzing ES9 and ES10
4.6.2.6 The effectiveness of new features
5 Conclusion and Future Work
Bibliography
Vita