臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: 林欣志 (Hsin-Chih Lin)
Title: 模擬退火法應用於倒傳遞類神經網路之參數調整與屬性篩選
Title (English): Optimization of Back-Propagation Network and Feature Selection Using Simulated Annealing Approach
Advisor: 林詩偉 (Shih-Wei Lin)
Degree: Master's
Institution: 華梵大學 (Huafan University)
Department: Graduate Institute of Information Management
Discipline: Computer Science
Field: General Computer Science
Document type: Academic thesis
Year of publication: 2006
Academic year: 94 (2005-2006)
Language: Chinese
Pages: 49
Keywords (Chinese): 倒傳遞類神經網路 (back-propagation neural network); 模擬退火法 (simulated annealing); 最佳化 (optimization); 屬性篩選 (feature selection)
Keywords (English): Back-propagation network; Simulated annealing; Feature selection
Usage statistics:
  • Cited by: 6
  • Views: 442
  • Rating:
  • Downloads: 110
  • Bookmarked: 0
Abstract (translated from the Chinese):
The back-propagation neural network (BPN) is a commonly used data mining tool. However, different problems require different BPN parameters and architectures, so users typically resort to trial and error, settling on a configuration only after many experiments; this process may still yield poor parameters and architectures. In addition, a dataset contains many features, not all of which help prediction. This study therefore proposes using simulated annealing to search for the BPN architecture and parameters while performing feature selection, picking out the features that improve prediction accuracy.
To evaluate the proposed method, this study uses datasets from the UCI (University of California, Irvine) machine learning repository, computing accuracy via 10-fold cross-validation and comparing against previous studies. When only parameter and architecture tuning is considered, the proposed method outperforms earlier approaches. When feature selection is also considered, test accuracy improves on most datasets. The proposed method can thus effectively find good network architectures and parameters, identify helpful features, and raise classification accuracy.
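As an illustration of how simulated annealing can drive this kind of joint parameter/feature search, here is a minimal, self-contained sketch. The solution encoding (an 8-bit feature mask plus a learning rate and a hidden-neuron count), the toy objective standing in for BPN accuracy, and the geometric cooling schedule are all assumptions for demonstration, not the thesis's actual implementation:

```python
import math
import random

random.seed(0)

N_FEATURES = 8

def evaluate(solution):
    """Toy stand-in for BPN classification accuracy (assumption: the
    real objective would train a network with the encoded settings).
    Rewards selecting features 0-3 and mid-range hyperparameters."""
    mask, lr, hidden = solution
    good = sum(mask[:4]) - 0.5 * sum(mask[4:])
    return good - (lr - 0.3) ** 2 - 0.01 * (hidden - 10) ** 2

def neighbor(solution):
    """Perturb one component: flip a feature bit or nudge a parameter."""
    mask, lr, hidden = list(solution[0]), solution[1], solution[2]
    choice = random.randrange(3)
    if choice == 0:
        i = random.randrange(N_FEATURES)
        mask[i] = 1 - mask[i]
    elif choice == 1:
        lr = min(1.0, max(0.01, lr + random.uniform(-0.05, 0.05)))
    else:
        hidden = max(1, hidden + random.choice([-1, 1]))
    return (mask, lr, hidden)

def simulated_annealing(t0=1.0, cooling=0.95, iters_per_temp=20, t_min=1e-3):
    current = ([random.randint(0, 1) for _ in range(N_FEATURES)], 0.5, 5)
    best = current
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            cand = neighbor(current)
            delta = evaluate(cand) - evaluate(current)
            # Metropolis criterion: always accept improvements, and
            # accept worse moves with probability exp(delta / T).
            if delta > 0 or random.random() < math.exp(delta / t):
                current = cand
            if evaluate(current) > evaluate(best):
                best = current
        t *= cooling
    return best

mask, lr, hidden = simulated_annealing()
print(mask, round(lr, 3), hidden)
```

In the thesis's setting, `evaluate` would instead train a BPN with the encoded architecture, parameters, and feature subset, and return its cross-validation accuracy; the acceptance rule above is the standard Metropolis criterion cited in the references ([51], [52]).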
Abstract (English):
The back-propagation network (BPN) is a popular data mining technique. Nevertheless, different problems may require different network architectures and parameters, which are usually determined by rules of thumb or trial and error. These methods, however, may lead to poor network architectures and parameters. Moreover, a dataset may contain many features, and not all of them are beneficial for classification with a BPN. Therefore, a simulated annealing (SA) approach is proposed to obtain better network architectures and parameters and to select the subset of features that yields better classification.
To evaluate the proposed approach, datasets from the UCI Machine Learning Repository are used, with 10-fold cross-validation applied to calculate the classification results. The experimental results show that the network architectures and parameters obtained by the proposed approach are better than those of other approaches. When feature selection is taken into consideration, the classification accuracy rates on most datasets increase. Therefore, the developed approach can effectively find the network architecture and parameters of a BPN and discover useful features.
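The 10-fold cross-validation protocol described above can likewise be sketched with a small stdlib-only example; the toy dataset and the nearest-centroid classifier standing in for the trained BPN are assumptions for demonstration:

```python
import random

random.seed(1)

# Toy two-class dataset: class 0 clusters near 0.0, class 1 near 1.0.
data = [([random.gauss(c, 0.3)], c) for c in (0, 1) for _ in range(50)]
random.shuffle(data)

def train_centroids(train):
    """Stand-in for BPN training: per-class mean of the single feature."""
    sums, counts = {}, {}
    for (x,), y in train:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    # Assign the class whose centroid is closest to x.
    return min(centroids, key=lambda y: abs(centroids[y] - x))

def cross_val_accuracy(data, k=10):
    """k-fold cross-validation: each sample is tested exactly once,
    and the k fold accuracies are averaged into a single figure."""
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        model = train_centroids(train)
        correct = sum(predict(model, x) == y for (x,), y in test)
        accs.append(correct / len(test))
    return sum(accs) / k

print(round(cross_val_accuracy(data), 3))
```

Replacing `train_centroids`/`predict` with BPN training and inference (under a given architecture, parameter set, and feature subset) recovers the evaluation loop the abstract describes.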
Acknowledgements I
Abstract (Chinese) II
Abstract (English) III
Table of Contents IV
List of Tables V
List of Figures VI
1. Introduction 1
1.1 Background and Motivation 1
1.2 Research Objectives 5
1.3 Research Limitations 5
1.4 Research Process 5
2. Literature Review 7
2.1 Back-Propagation Neural Networks 7
2.1.1 BPN Architecture 9
2.1.2 BPN Algorithm 11
2.2 Related BPN Research 13
2.3 Simulated Annealing 16
2.4 Hide-and-Seek SA 18
2.5 Feature Selection 20
2.5.1 Wrapper Model 21
2.5.2 Filter Model 22
2.6 Principal Component Analysis 25
3. Methodology 27
3.1 Solution Encoding 27
3.2 Data Preprocessing 28
3.3 Determining the Number of SA Iterations 30
3.4 SA+BPN Procedure 35
4. Experimental Results and Analysis 37
4.1 SA+BPN Parameter Tuning without Feature Selection 37
4.2 SA+BPN Parameter Tuning with Feature Selection 39
5. Conclusions and Future Research 43
References 45
[1] 葉怡成 (2003), Neural Network Models: Applications and Practice, 5th ed., 儒林出版社, Taipei, Taiwan.
[2] 張振魁, "Improving the Profit of Single-Day Stock Trading Strategies Using Neural Networks," Master's thesis, Institute of Information Management, National Central University, 2000.
[3] 孫漢屏, "A Neural-Network-Based Intelligent Hospital Online Registration System," Master's thesis, Institute of Hospital Administration, China Medical College, 2002.
[4] 莊文仲, "Multilayer Perceptron Equalizers Using Evolutionary Algorithms," Master's thesis, Institute of Electrical Engineering, National Central University, 2001.
[5] 李正彬, "A New Neural Network Learning Method: the Ant Nest Algorithm," Master's thesis, Department of Industrial Engineering and Management, Yuan Ze University, 2002.
[6] 俞慧華, "A Study of Improved Neural Network Models for Credit Card Customer Relationship Management," Master's thesis, Institute of Commerce Automation and Management, National Taipei University of Technology, 2002.
[7] 葉松炫, "Exchange Rate Forecasting Using Neural Networks," Master's thesis, Department of Finance, National Sun Yat-sen University, 2000.
[8] 林建廷, "An Applied Study of Neural Networks for Consumer Segmentation and Forecasting in the Wireless Communications Market," Master's thesis, Graduate School of Management, Yuan Ze University, 2000.
[9] 李修宇, "Exploring the Relationships among Weather, Pollution, and Asthma Attacks with Data Mining Techniques: a Neural BPN Model Example," Master's thesis, Department of Information Management, Nanhua University, 2001.
[10] 高仲仁, "Tunnel Rock Mass Classification Using Neural Networks," Master's thesis, Institute of Applied Geology, National Central University, 2001.
[11] 胡俊男, "Applying Neural Networks to Real-Time Control of Semiconductor Manufacturing Processes," Master's thesis, Department of Industrial Engineering and Management, Yuan Ze University, 2002.
[12] 胡永國, "Estimating Land Subsidence Caused by Groundwater Drawdown Using Neural Networks," Master's thesis, Department of Civil Engineering, National Pingtung University of Science and Technology, 2002.
[13] 潘文超, "Building an e-Forecasting System with Genetically Evolved Neural Networks: Case Studies of Lottery and Financial Distress Prediction," Master's thesis, Department of Economics, Soochow University, 2003.
[14] 楊宗儒, "A Study of Back-Propagation Neural Networks for Storm Surge Prediction," Master's thesis, Institute of Resources and Environment, Leader University, 2004.
[15] 楊東翰, "Integrating Genetic Algorithms and Neural Networks for Printed Circuit Board Production Forecasting," Master's thesis, Department of Industrial Engineering and Management, Yuan Ze University, 2004.
[16] 郭彥良, "Applying Neural Networks to Predict the Discharge of Shihmen Reservoir," Master's thesis, Department of Civil Engineering, Chung Hua University, 2004.
[17] 周文賢 (2004), Multivariate Statistical Analysis, 2nd ed., 智勝文化事業有限公司, Taipei, Taiwan.
[18] Berry, M. J. A. and Linoff, G., Data Mining Techniques: for Marketing, Sales and Customer Support, John Wiley and Sons, 2001.
[19] Malhotra, R. and Malhotra, D. K., “Evaluating Consumer Loans Using Neural Networks,” The International Journal of Management Science, vol. 31, Jan. 2003, pp. 83-96.
[20] Lee, T. -S., Chiu, C. -C., Chou, Y. -C. and Lu, C. -J., “Mining the Customer Credit Using Classification and Regression Tree and Multivariate Adaptive Regression Splines,” Computational Statistics and Data Analysis, vol. 50, 2006, pp. 1113-1130.
[21] Zhang, Q. -J., Gupta, K. C. and Devabhaktuni, V. K., “Artificial Neural Networks for RF and Microwave Design-from Theory to Practice,” IEEE Transactions on Microwave Theory and Techniques, vol. 51, 2003, pp. 1339-1350.
[22] Visen, N. S., Paliwal, J., Jayas, D. S. and White, N. D. G., “Specialist Neural Networks for Cereal Grain Classification,” Biosystems Engineering, vol. 82, June. 2002, pp. 151-159.
[23] El-Din, A. G. and Smith, D. W., “A Neural Network Model to Predict the Wastewater Inflow Incorporating Rainfall Events,” Water Research, vol. 36, 2002, pp. 1115-1126.
[24] Gueli, N. et al., “The Influence of Lifestyle on Cardiovascular Risk Factors Analysis Using a Neural Network,” Archives of Gerontology and Geriatrics, vol. 40, 2005, pp. 157-172.
[25] Subasi, A., “Automatic Recognition of Alertness Level from EEG by Using Neural Network and Wavelet Coefficients,” Expert Systems with Applications, vol. 28, 2005, pp. 701-711.
[26] Bogdanov, A. V., Sandven, S., Johannessen, O. M., Alexandrov, V. Y. and Bobylev, L. P., “Multisensor Approach to Automated Classification of Sea Ice Image Data,” IEEE Transactions on Geosciences and Remote Sensing, vol. 43, 2005, pp. 1648-1664.
[27] Neaupane, K. M. and Achet, S. H., “Use of Backpropagation Neural Network for Landslide Monitoring: a Case Study in the Higher Himalaya,” Engineering Geology, vol. 74, 2004, pp. 213-226.
[28] Han, J. and Kamber, M., Data Mining: Concepts and Techniques, Morgan Kaufmann, San Francisco, 2003.
[29] Matsunaga, A. and Ogawa, K., “Scatter Correction in Multinuclide Data Acquisition by Means of a Neural Network,” Proceedings of IEEE on Nuclear Science Symposium, vol. 2, 1999, pp. 948-952.
[30] Sexton, R. S., Alidaee, B., Dorsey, R. E. and Johnson, J. D., “Global Optimization for Artificial Neural Networks: a Tabu Search Application,” European Journal of Operational Research, vol. 106, 1998, pp. 570-584.
[31] Gupta, J. N. D. and Sexton, R. S., “Comparing Backpropagation with a Genetic Algorithm for Neural Network Training,” The International Journal of Management Science, vol. 27, 1999, pp. 679-684.
[32] Ghosh, R. and Verma, B., “A Hierarchical Method for Finding Optimal Architecture and Weights Using Evolutionary Least Square Based Learning,” International Journal of Neural Systems, vol. 13, 2003, pp. 13-24.
[33] Khaw, J. F. C., Lim, B. S. and Lim, L. E. N., “Optimal Design of Neural Networks Using the Taguchi Method,” Neurocomputing, vol. 7, 1995, pp. 225-245.
[34] Castillo, P. A., Merelo, J. J., Prieto, A., Rivas, V. and Romero, G., “G-Prop: Global Optimization of Multilayer Perceptrons Using GAs,” Neurocomputing, vol. 35, 2000, pp. 149-163.
[35] Castillo, P. A., Carpio, J., Merelo, J. J., Prieto, A., Rivas, V. and Romero, G., “Evolving Multilayer Perceptrons,” Neural Processing Letters, vol. 12, 2000, pp. 115-127.
[36] Wang, T.-Y. and Huang, C.-Y., “Applying Optimized BPN to a Chaotic Time Series Problem,” Expert Systems with Applications, 2006. (in press)
[37] Yeung, D. S. and Zeng, X. -Q., “Hidden Neuron Pruning for Multilayer Perceptrons Using a Sensitivity Measure,” Proceedings of IEEE on Machine Learning and Cybernetics, vol. 4, 2002, pp. 1751-1757.
[38] Yamazaki, A., de Souto, M. C. P. and Ludermir, T. B., “Optimization of Neural Network Weights and Architectures for Odor Recognition Using Simulated Annealing,” Proceedings of IEEE on Neural Networks, vol. 1, 2002, pp. 547-552.
[39] Lezoray, O. and Cardot, H., “A Neural Network Architecture for Data Classification,” International Journal of Neural Systems, vol. 11, 2001, pp. 33-42.
[40] Yang, J. and Honavar, V., “Feature Subset Selection Using a Genetic Algorithm,” IEEE Intelligent Systems, vol. 13, 1998, pp. 44-49.
[41] Abe, N. and Kudo, M., “Non-Parametric Classifier-Independent Feature Selection,” Pattern Recognition, to be published.
[42] Kocur, C. M. et al., “Using Neural Networks to Select Wavelet Features for Breast Cancer Diagnosis,” IEEE Engineering in Medicine and Biology, vol. 15, 1996, pp. 95-102.
[43] Kim, K. -J. and Han, I., “Genetic Algorithms Approach to Feature Discretization in Artificial Neural Networks for the Prediction of Stock Price Index,” Expert Systems with Applications, vol. 19, 2000, pp. 125-132.
[44] Zhang, L., Jack, L. B. and Nandi, A. K., “Fault Detection Using Genetic Programming,” Mechanical Systems and Signal Processing, vol. 19, 2005, pp. 271-289.
[45] Jiang, W., Er, G., Dai, Q. and Gu, J., “Similarity-Based Online Feature Selection in Content-Based Image Retrieval,” IEEE Transactions on Image Processing, vol. 15, 2006, pp. 702-712.
[46] Sexton, R. S., Dorsey, R. E. and Johnson, J. D., “Optimization of Neural Networks: A Comparative Analysis of the Genetic Algorithm and Simulated Annealing,” European Journal of Operational Research, vol. 114, 1999, pp. 589-601.
[47] Romeijn, H. E. and Smith, R. L., “Simulated Annealing Constrained Global Optimization,” Journal of Global Optimization, vol. 5, 1994, pp. 101-126.
[48] Romeijn, H. E., Zabinsky, Z. B., Graesser, D. L. and Neogi, S., “New Reflection Generator for Simulated Annealing in Mixed-Integer/Continuous Global Optimization,” Journal of Optimization Theory and Applications, vol. 101, 1999, pp. 403-427.
[49] Blake, C. L. and Merz, C. J., UCI Repository of Machine Learning Databases, Department of Information and Computer Science, University of California, Irvine, 1998. Available: http://www.ics.uci.edu/~mlearn/MLRepository.html.
[50] Salzberg, S. L., “On Comparing Classifiers: Pitfalls to Avoid and a Recommended Approach,” Data Mining and Knowledge Discovery, vol. 1, 1997, pp. 317-328.
[51] Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H. and Teller, E., “Equation of State Calculations by Fast Computing Machines,” The Journal of Chemical Physics, vol. 21, 1953, pp. 1087-1092.
[52] Kirkpatrick, S., Gelatt, C. D., Jr. and Vecchi, M. P., “Optimization by Simulated Annealing,” Science, vol. 220, 1983, pp. 671-680.
[53] Reed, R., “Pruning Algorithms - a Survey,” IEEE Transactions on Neural Networks, vol. 4, 1993, pp. 740-747.
[54] Liu, H. and Motoda, H., Feature Selection for Knowledge Discovery and Data Mining, Kluwer Academic, Boston, 1998.
[55] Sexton, R. S., McMurtrey, S. and Cleavenger, D. J., “Knowledge Discovery Using a Neural Network Simultaneous Optimization Algorithm on a Real World Classification Problem,” European Journal of Operational Research, vol. 168, 2006, pp. 1009-1018.
[56] Kohavi, R. and John, G. H., “Wrappers for Feature Subset Selection,” Artificial Intelligence, vol. 97, 1997, pp. 273-324.
[57] Pudil, P., Novovičová, J. and Kittler, J., “Floating Search Methods in Feature Selection,” Pattern Recognition Letters, vol. 15, 1994, pp. 1119-1125.
[58] Verikas, A. and Bacauskiene, M., “Feature Selection with Neural Networks,” Pattern Recognition Letters, vol. 23, 2002, pp. 1323-1335.
[59] Sivagaminathan, R. K. and Ramakrishnan, S., “A Hybrid Approach for Feature Subset Selection Using Neural Networks and Ant Colony Optimization,” Expert Systems with Applications, 2006. (in press)