1. R. K. Belew and L. B. Booker, Eds., Proceedings of the Fourth International Conference on Genetic Algorithms, Morgan Kaufmann, 1991.
2. J. Holland, Adaptation in Natural and Artificial Systems, Ann Arbor, MI: University of Michigan Press, 1975.
3. T. Bäck, U. Hammel and H.-P. Schwefel, “Evolutionary computation: comments on the history and current state,” IEEE Trans. on Evolutionary Computation, vol. 1, no. 1, pp. 3-17, Apr. 1997.
4. S. Chen, Y. Wu and B. L. Luk, “Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks,” IEEE Trans. on Neural Networks, vol. 10, no. 5, pp. 1239-1243, 1999.
5. S. Aiguo and L. Jiren, “Evolving Gaussian RBF network for nonlinear time series modeling and prediction,” Electronics Letters, vol. 34, no. 12, pp. 1241-1243, June 1998.
6. B. Yunfei and L. Zhang, “Genetic algorithm based self-growing training for RBF neural networks,” in Proc. of the International Joint Conference on Neural Networks (IJCNN’02), vol. 1, pp. 840-845, 2002.
7. D. J. Montana and L. Davis, “Training feedforward neural networks using genetic algorithms,” in Proc. of the International Joint Conference on Artificial Intelligence, pp. 762-767, 1989.
8. S. Kirkpatrick, C. D. Gelatt, Jr. and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, no. 4598, pp. 671-680, 1983.
9. E. Aarts and J. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, New York, 1989.
10. C. S. Koh, S. Y. Hahn and O. A. Mohammed, “Detection of magnetic body using artificial neural network with modified simulated annealing,” IEEE Transactions on Magnetics, vol. 30, no. 5, pp. 3644-3647, 1994.
11. Y. L. Mao, G. Z. Zhang, B. Zhu and M. Zhou, “Chaotic simulated annealing neural network with decaying chaotic noise and its application in economic load dispatch of power systems,” in Proc. of the 2004 IEEE International Conference on Information Reuse and Integration, pp. 536-542, 2004.
12. M. Dorigo and T. Stützle, Ant Colony Optimization, MIT Press, Cambridge, MA, 2004.
13. B. Bilchev and I. C. Parmee, “The ant colony metaphor for searching continuous design spaces,” in Proc. of the AISB Workshop on Evolutionary Computing, ser. LNCS, vol. 993, pp. 25-39, 1995.
14. N. Monmarché, G. Venturini and M. Slimane, “On how Pachycondyla apicalis ants suggest a new search algorithm,” Future Generation Computer Systems, vol. 16, no. 8, pp. 937-946, June 2000.
15. J. Dréo and P. Siarry, “A new ant colony algorithm using the heterarchical concept aimed at optimization of multiminima continuous functions,” in Proc. of ANTS 2002, ser. LNCS, M. Dorigo et al., Eds., vol. 2463, Springer-Verlag, Berlin, Germany, pp. 216-221, 2002.
16. K. Socha, “Extended ACO for continuous and mixed-variable optimization,” in Proc. of ANTS 2004, ser. LNCS, M. Dorigo et al., Eds., Springer-Verlag, Berlin, Germany, pp. 25-36, 2004.
17. C. Blum and K. Socha, “Training feed-forward neural networks with ant colony optimization: an application to pattern classification,” in Proc. of the Hybrid Intelligent Systems Conference (HIS 2005), Rio de Janeiro, Brazil, Nov. 6-9, 2005.
18. J. Kennedy and R. C. Eberhart, “Particle swarm optimization,” in Proc. of the IEEE International Conference on Neural Networks, Perth, Australia, pp. 1942-1948, 1995.
19. V. G. Gudise and G. K. Venayagamoorthy, “Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks,” in Proc. of the IEEE Swarm Intelligence Symposium, Indianapolis, IN, USA, pp. 110-117, April 24-26, 2003.
20. W. Zha and G. K. Venayagamoorthy, “Neural networks based non-uniform scalar quantizer design with particle swarm optimization,” in Proc. of the IEEE Swarm Intelligence Symposium (SIS 2005), pp. 143-148, June 8-10, 2005.
21. A. Kazemi and C. K. Mohan, “Training feedforward neural networks using multi-phase particle swarm optimization,” in Proc. of the Ninth International Conference on Neural Information Processing, vol. 5, pp. 2615-2619, 2002.
22. V. G. Gudise and G. K. Venayagamoorthy, “Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks,” in Proc. of the IEEE Swarm Intelligence Symposium, pp. 110-117, April 24-26, 2003.
23. T. Y. Sun, S. T. Hsieh and C. W. Lin, “Particle swarm optimization incorporated with disturbance for improving the efficiency of macrocell overlap removal and placement,” in Proc. of the 2005 International Conference on Artificial Intelligence (ICAI’05), pp. 122-125, June 2005.
24. 張孝德 and 蘇木春, Machine Learning: Neural Networks, Fuzzy Systems, and Genetic Algorithms (in Chinese), 全華 (Chuan Hwa Book Co.), 1997.
25. J. Kennedy, “Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance,” in Proc. of the IEEE Congress on Evolutionary Computation, vol. 3, pp. 1931-1938, 1999.
26. C. L. Lin, S. T. Hsieh, T. Y. Sun and C. C. Liu, “PSO-based learning rate adjustment for blind source separation,” in Proc. of the International Symposium on Intelligent Signal Processing and Communications Systems (ISPACS), pp. 181-184, Dec. 2005.
27. F. van den Bergh and A. P. Engelbrecht, “Cooperative learning in neural networks using particle swarm optimizers,” South African Computer Journal, no. 26, pp. 84-90, 2000.
28. A. Cichocki and S. Amari, Adaptive Blind Signal and Image Processing, Wiley, 2002.
29. A. J. Bell and T. J. Sejnowski, “Learning the higher-order structure of a natural sound,” Network: Computation in Neural Systems, pp. 261-266, 1996.
30. A. J. Bell and T. J. Sejnowski, “An information-maximization approach to blind separation and blind deconvolution,” Neural Computation, vol. 7, no. 6, pp. 1129-1159, 1995.
31. S. Amari, “Theory of adaptive pattern classifiers,” IEEE Trans. on Electronic Computers, vol. EC-16, pp. 299-307, 1967.
32. S. C. Douglas and A. Cichocki, “Adaptive step-size techniques for decorrelation and blind source separation,” in Proc. of the 32nd Asilomar Conf. on Signals, Systems, and Computers, vol. 2, Pacific Grove, CA, pp. 1191-1195, Nov. 1998.
33. S. T. Lou and X. D. Zhang, “Fuzzy-based learning rate determination for blind source separation,” IEEE Transactions on Fuzzy Systems, vol. 11, no. 3, pp. 375-383, June 2003.
34. P. Comon, “Independent component analysis, a new concept?,” Signal Processing, vol. 36, pp. 287-314, 1994.
35. J. F. Cardoso and B. H. Laheld, “Equivariant adaptive source separation,” IEEE Trans. on Signal Processing, vol. 44, pp. 3017-3030, Dec. 1996.
36. S. Amari, A. Cichocki and H. H. Yang, “A new learning algorithm for blind signal separation,” in Advances in Neural Information Processing Systems 8, Cambridge, MA: MIT Press, pp. 757-763, 1996.
37. S. Cruces, A. Cichocki and L. Castedo, “An iterative inversion approach to blind source separation,” IEEE Trans. on Neural Networks, vol. 11, pp. 1423-1437, Nov. 2000.
38. A. Cichocki and R. Unbehauen, “Robust neural networks with on-line learning for blind identification and blind separation of sources,” IEEE Trans. on Circuits and Systems I, vol. 43, pp. 894-906, Oct. 1996.
39. S. Amari, “Theory of adaptive pattern classifiers,” IEEE Trans. on Electronic Computers, vol. EC-16, pp. 299-307, 1967.
40. N. Murata, K. Müller, A. Ziehe and S. Amari, “Adaptive on-line learning in changing environments,” in Advances in Neural Information Processing Systems 9, Cambridge, MA: MIT Press, pp. 599-605, 1997.
41. S. C. Douglas and A. Cichocki, “Adaptive step-size techniques for decorrelation and blind source separation,” in Proc. of the 32nd Asilomar Conf. on Signals, Systems, and Computers, vol. 2, Pacific Grove, CA, pp. 1191-1195, Nov. 1998.
42. S. Amari and A. Cichocki, “Adaptive blind signal processing – neural network approaches,” Proceedings of the IEEE, vol. 86, pp. 2026-2048, 1998.
43. C. L. Lin, S. T. Hsieh, T. Y. Sun and C. C. Liu, “PSO-based learning rate adjustment for blind source separation,” in Proc. of the International Symposium on Intelligent Signal Processing and Communications Systems (ISPACS), pp. 181-184, Dec. 2005.
44. D. S. Broomhead and D. Lowe, “Multivariable functional interpolation and adaptive networks,” Complex Systems, vol. 2, pp. 321-355, 1988.
45. D. Lowe, “Adaptive radial basis function nonlinearities, and the problem of generalization,” in Proc. of the IEE International Conference on Artificial Neural Networks, London, UK, pp. 171-175, 1989.
46. J. A. S. Freeman and D. Saad, “Learning and generalization in radial basis function networks,” Neural Computation, vol. 9, no. 7, pp. 1601-1622, 1997.
47. J. Moody and C. J. Darken, “Fast learning in networks of locally-tuned processing units,” Neural Computation, vol. 1, pp. 281-294, 1989.
48. N. B. Karayiannis and G. W. Mi, “Growing radial basis neural networks: merging supervised and unsupervised learning with network growth techniques,” IEEE Trans. on Neural Networks, vol. 8, no. 6, pp. 1492-1506, Nov. 1997.
49. N. Zheng, Z. Zhang, G. Shi and Y. Qiao, “Self-creating and adaptive learning of RBF networks: merging soft-competition clustering algorithm with network growth technique,” in Proc. of the International Joint Conference on Neural Networks (IJCNN’99), vol. 2, pp. 1131-1135, 1999.
50. S. Chen, “Nonlinear time series modeling and prediction using Gaussian RBF networks with enhanced clustering and RLS learning,” Electronics Letters, vol. 31, no. 2, pp. 117-118, 1995.
51. A. Song and J. Lu, “Evolving Gaussian RBF network for nonlinear time series modeling and prediction,” Electronics Letters, vol. 34, no. 12, pp. 1241-1243, June 1998.