
National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)

Detailed Record

Author: 林煒清
Title: 基於均值移動之自適應共變異數矩陣演化策略
Title (English): Mean Shift based Evolution Strategy with Covariance Matrix Adaption
Advisor: 林昇甫
Degree: Master's
Institution: National Chiao Tung University
Department: Institute of Electrical and Control Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2011
Graduation academic year: 99 (ROC calendar)
Language: Chinese
Pages: 65
Keywords (Chinese): 自適應共變異數矩陣演化策略, 均值移動, 最佳化
Keywords (English): evolution strategy with covariance matrix adaption, mean shift, optimization
This thesis introduces a computational module based on mean shift into the covariance matrix adaptation evolution strategy (CMA-ES). The introduced mean shift procedure provides a clustering capability, which allows multiple CMA-ES modules to explore different regions of the search space in parallel. The novelty of this thesis is that the number of modules need not be specified in advance: the clustering capability of mean shift, applied through kernel density estimation over the CMA-ES samples, determines an appropriate number of CMA-ES modules. Treating each module as an independent CMA-ES agent in the search space strengthens the algorithm's global optimization ability. The proposed mean shift based covariance matrix adaptation evolution strategy (MS-CMA-ES) is applied to the optimization of multi-funnel functions to verify its performance. The experimental results show that the algorithm performs well on multimodal, multi-funnel functions.
We introduce a computational module based on the mean shift procedure into the evolution strategy with covariance matrix adaptation (CMA-ES). The introduced mean shift procedure provides a clustering capability, which allows us to apply multiple CMA-ES instances to explore different parts of the search space in parallel. The novelty of our approach is that we do not require the number of CMA-ES instances as a parameter; instead, we apply a mean shift-based mode detection method to the kernel density estimate of the selected points from the CMA-ES samples to determine the number of CMA-ES instances. Global exploration ability is enhanced because each instance forms a separate CMA-ES agent exploring its own part of the search space. We evaluate the performance of the new mean shift based evolution strategy with covariance matrix adaptation (MS-CMA-ES) on the optimization of multi-funnel functions. The MS-CMA-ES algorithm shows good performance on non-convex, multi-funnel functions.
Keywords: mean shift, evolution strategy with covariance matrix adaptation, optimization.
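The mode-detection step the abstract describes — running mean shift on a kernel density estimate of sample points and counting the distinct modes it converges to — can be sketched as follows. This is a minimal illustration, not the thesis implementation: the fixed Gaussian bandwidth and the merge radius of half a bandwidth are assumptions.

```python
import numpy as np

def mean_shift_modes(points, bandwidth=1.0, tol=1e-5, max_iter=200):
    """Shift every point uphill on a Gaussian kernel density estimate,
    then merge the converged positions into distinct modes."""
    points = np.asarray(points, dtype=float)
    shifted = points.copy()
    for _ in range(max_iter):
        moved = 0.0
        for i, x in enumerate(shifted):
            # Gaussian kernel weights of x against all original samples.
            d2 = np.sum((points - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))
            new_x = w @ points / w.sum()  # weighted mean = one mean shift step
            moved = max(moved, np.linalg.norm(new_x - x))
            shifted[i] = new_x
        if moved < tol:
            break
    # Points that landed within half a bandwidth of each other share a mode.
    modes = []
    for x in shifted:
        if all(np.linalg.norm(x - m) > bandwidth / 2 for m in modes):
            modes.append(x)
    return np.array(modes)
```

In the MS-CMA-ES setting, the number of modes found this way would play the role of the number of CMA-ES instances, so no instance count has to be supplied up front.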

Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
1.1 Related Work
1.2 Motivation
1.3 Thesis Organization
Chapter 2: Background
2.1 Covariance Matrix Adaptation Evolution Strategy
2.1.1 Sampling Distribution
2.1.2 Mean Update
2.1.3 Covariance Matrix Update
2.1.4 Global Step-Size Update
2.2 Kernel Density Estimation
2.3 Mean Shift
2.4 Importance Sampling
2.5 Finite Gaussian Mixture Models
Chapter 3: Mean Shift Based Covariance Matrix Adaptation Evolution Strategy
3.1 Origin and Concept of the Idea
3.2 Main System Flow and Architecture
3.3 Gaussian Mixture Model Sampling Formulas and Methods
3.3.1 Gaussian Mixture Model Sampling Formulas
3.3.2 Gaussian Mixture Model Sampling Methods
3.4 Mean Shift Clustering
3.5 Refining the Clustering Results
3.6 Updating the Gaussian Mixture Model
Chapter 4: Experimental Results and Analysis
4.1 Global Search Ability Experiments
4.1.1 Multi-Funnel Functions
4.1.2 Experimental Design and Results
4.2 Benchmark Function Experiments
4.2.1 Benchmark Functions
4.2.2 Results and Discussion
4.2.3 Threshold Parameter Experiments
Chapter 5: Conclusions and Future Work
References
Appendix 1: Benchmark Function Definitions
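The CMA-ES building blocks outlined in Chapter 2 — sampling from a multivariate normal, a rank-weighted mean update, a covariance update, and step-size control — can be illustrated with a deliberately simplified (μ/λ) evolution-strategy loop. This is a sketch, not the thesis implementation: it omits the evolution paths of full CMA-ES, and the fixed step-size decay and the 0.8/0.2 covariance blending are assumptions chosen only for illustration.

```python
import numpy as np

def simple_es(f, x0, sigma=0.5, lam=16, mu=4, iters=100, seed=0):
    """Toy (mu/lambda) evolution strategy with a crude covariance update."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    mean, C = np.array(x0, dtype=float), np.eye(n)
    # Rank-based recombination weights, best sample weighted most.
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    for _ in range(iters):
        A = np.linalg.cholesky(C)
        steps = rng.standard_normal((lam, n)) @ A.T  # steps ~ N(0, C)
        pop = mean + sigma * steps                   # sampling distribution
        order = np.argsort([f(x) for x in pop])[:mu] # select the mu best
        mean = w @ pop[order]                        # weighted mean update
        # Crude covariance update from the selected steps (no evolution paths).
        C = 0.8 * C + 0.2 * (steps[order].T * w) @ steps[order]
        sigma *= 0.95                                # fixed decay instead of CSA
    return mean
```

On a convex quadratic such as the sphere function, this loop contracts toward the optimum; the multi-funnel functions studied in Chapter 4 are exactly the landscapes where a single such instance tends to get trapped, motivating the multiple parallel instances of MS-CMA-ES.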

