|
[1] M. Lunacek, D. Whitley, and A. Sutton, “The impact of global structure on search,”Parallel Problem Solving from Nature - PPSN X, vol. 5199, pp. 498-507, 2008. [2] C. L. Muller and I. F. Sbalzarini, “A tunable real-world multi-funnel benchmark problem for evolutionary optimization and why parallel island models might remedy the failure of CMA-ES on it,” in Proc. of 2009 International Joint Conference on Computational Intelligence, Paris, France, pp. 248-253, 2009. [3] D. J. Wales, “Energy landscapes and properties of biomolecules,” Physical Biology, vol. 2, pp. 86-93, Dec 2005. [4] P. L. Clark, “Protein folding in the cell: reshaping the folding funnel,” Trends in Biochemical Sciences, vol. 29, pp. 527-534, Oct 2004. [5] C. L. Muller, B. Benedikt, and F. S. Ivo, “Particle Swarm CMA Evolution Strategy for the optimization of multi-funnel landscapes,” in Proc. of 2009 IEEE Congress on Evolutionary Computation, New York, USA, pp. 2685-2692, 2009. [6] D. Comaniciu and P. Meer, “Mean shift: A robust approach toward feature space analysis,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 603-619, May 2002. [7] N. Hansen and A. Ostermeier, “Completely derandomized self-adaptation in evolution atrategies,” Evolutionary Computation, vol. 9, pp. 159-195, Jun 2001. [8] N. Hansen, S. D. Muller, and P. Koumoutsakos, “Reducing the time complexity of the derandomized evolution atrategy with covariance matrix adaptation (CMA-ES),” Evolutionary Computation, vol. 11, pp. 1-18, Spr 2003. [9] M. A. T. Figueiredo and A. K. Jain, “Unsupervised learning of finite mixture models,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 381-396, Mar 2002. [10] C. M. Bishop, Pattern recognition and machine learning, New York: Springer, 2006. [11] M. P. Wand and M. C. Jones, “Comparison of smoothing parameterizations in bivariate kernel density-estimation,” Journal of the American Statistical Association, vol. 88, pp. 520-528, Jun 1993. [12] V. Granville, M. Kfivanek, and J. Rasson, “Simulated Annealing: a proof of convergence,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 6, pp. 652-656, 1994. [13] D. B. Fogel, T. Bäck, and Z. Michalewicz, Evolutionary Computation 1: Basic Algorithms and Operators, Taylor & Francis, 2000 [14] A. E. Eiben, and J. E. Smith, Introduction to evolutionary computing, Springer, 2003. [15] H. G. Beyer and H. P. Schwefel, “Evolution strategies: A comprehensive introduction,” Natural Computing, vol. 1, pp. 3-52, 2002. [16] A. Auger and N. Hansen, “Performance evaluation of an advanced local search evolutionary algorithm,” in Proc. of 2005 IEEE Congress on Evolutionary Computation, New York, USA, pp. 1777-1784, 2005. [17] A. Auger and N. Hansen, “A restart CMA evolution strategy with increasing population size,” in Proc. of 2005 IEEE Congress on Evolutionary Computation, New York, pp. 1769-1776, 2005. [18] N. Hansen and S. Kern, “Evaluating the CMA evolution strategy on multimodal test functions,” Parallel Problem Solving from Nature - PPSN XIII, vol. 3242, pp. 282-291, 2004. [19] C. T. Hsieh, C. M. Chen, and Y. P. Chen,“Particle Swarm Guided Evolution Strategy,” in Proc. of the 9th Annual Conference on Genetic and Evolutionary Computation(GECCO'07), New York, USA, 2007. [20] R. S. Stephan, “Defining a standard for particle swarm optimization,”IEEE Swarm Intelligence Symposium, 2007, Honolulu, Hawaii, USA, pp. 120-128, April 2007. [21] J. Kennedy, “The particle swarm: social adaptation of knowledge,” in Proc. of 1997 IEEE International Conference on Evolutionary Computation (ICEC '97), Indianapolis, IN, pp. 303-308, 1997. [22] S. Kern, S. D. Müller, N. Hansen, D. Büche, J. Ocenasek, and P. Koumoutsakos, “Learning probability distributions in continuous evolutionary algorithms-a comparative review,” Natural Computing, vol. 3, no. 1, pp. 77-112, 2004. [23] N. Hansen and A. Ostermeierm, “Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation,” in Proc. of the 1996 IEEE International Conference on Evolutionary Computation, pp. 312-317, 1996. [24] M. P. Wand and M. C. Jones, Kernel smoothing, London: Chapman & Hall, 1995. [25] M. C. Jones, J. S. Marron, and S. J. Sheather, “A brief survey of bandwidth selection for density estimation,” Journal of the American Statistical Association, vol. 91, no. 433, pp. 401-407, Mar 1996. [26] R. S. Stephan, “Multivariate locally adaptive density estimation,” Computational Statistics & Data Analysis, vol. 39, no. 2, pp. 165-186, Jan 2001. [27] R. S. Stephan and D. W. Scott, “On locally adaptive density estimation,” Journal of the American Statistical Association, vol. 91, no. 436, pp. 1525-1534, Dec 1996. [28] Y. Cheng, “Mean shift, mode seeking, and clustering,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, no. 8, pp. 790-799, Aug 1995. [29] S. T. Tokdar and R. E. Kass, ”Importance sampling: a review,” Wiley Interdisciplinary Reviews: Computational Statistics, vol. 2, no. 1, pp. 54-60, Feb 2010. [30] D. J. C. MacKay, Information theory, inference, and learning algorithms, Cambridge University Press, 2003. [31] P. Zhang, “Nonparametric importance sampling,” Journal of the American Statistical Association, vol. 91, no. 435, pp. 1245-1253, Sep 1996. [32] D, Koller and N, Friedman, probabilistic graphical models, MIT Press, 2009. [33] M. Lynch, “Evolution of the mutation rate,” Trends in Genetic, vol. 26, no. 436, pp. 345–352, 2010. [34] D. Comaniciu, “An algorithm for data-drivan bandwidth selection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 2, pp. 281-288, Feb 2003. [35] D. Comaniciu, V. Ramesh, and P. Meer, “The variable bandwidth mean shift and data-driven scale scale selection,” in Proc. of 2001 Eighth IEEE International Conference on Computer Vision, Vancouver, BC, Canada, vol. 1, pp. 438 - 445, 20017. [36] G. R. Terrell, “The maximal smoothing principle in density estimation,” Journal of the American Statistical Association, vol. 85, no. 7, pp. 470-480, Dec 1990. [37] N. Hansen, A. Auger, S. Finck, and R. Ros, “Real-parameter black-box optimization benchmarking 2010: experimental setup,” INRIA research report RR-7215, Sep 2010. [38] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y. P. Chen, A. Auger, and S. Tiwari, “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” Nanyang Technological University, Singapore KanGAL Report #2005005, 2005. [39] Y. Akimoto, Y. Nagata, I. Ono, and S. Kobayashi, “Bidirectional relation between CMA evolution strategies and natural evolution strategies,” Parallel Problem Solving from Nature – PPSN XI, vol. 6238, pp. 154-163, 2010.
|