[Bor.1] A. G. Bors, “Introduction of the radial basis function networks,” Online Symposium for Electronics Engineers, vol. 1, pp. 1-7, 2001.
[Chu.1] C. C. Chuang, J. T. Jeng, and P. T. Lin, “Annealing robust radial basis function networks for function approximation with outliers,” Neurocomputing, vol. 56, pp. 123-139, 2004.
[Far.1] J. J. Faraway, Extending the Linear Model with R: Generalized Linear, Mixed Effects and Nonparametric Regression Models. CRC Press, Boca Raton, Florida, 2006.
[Hart.1] E. J. Hartman, J. D. Keeler, and J. M. Kowalski, “Layered neural networks with Gaussian hidden units as universal approximations,” Neural Computation, vol. 2, no. 2, pp. 210-215, 1990.
[Härd.1] W. Härdle, M. Müller, S. Sperlich, and A. Werwatz, Nonparametric and Semiparametric Models. Springer, Berlin, Germany, 2004.
[Has.1] T. J. Hastie and R. J. Tibshirani, Generalized Additive Models. Chapman & Hall, London, 1990.
[Hsi.1] J. G. Hsieh, Y. L. Lin, and J. H. Jeng, “Preliminary study on Wilcoxon learning machines,” IEEE Transactions on Neural Networks, vol. 19, no. 2, pp. 201-211, 2008.
[Kec.1] V. Kecman, Learning and Soft Computing. MIT Press, Cambridge, Massachusetts, 2001.
[Lin.1] Y. L. Lin, SVM-based Robust Template Design of Cellular Neural Networks and Primary Study of Wilcoxon Learning Machines. PhD Thesis, Department of Electrical Engineering, National Sun Yat-Sen University, Kaohsiung, Taiwan, 2006.
[Mam.1] E. Mammen and B. U. Park, “A simple smooth backfitting method for additive models,” Annals of Statistics, vol. 34, no. 5, pp. 2252-2271, 2006.
[Mon.1] D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to Linear Regression Analysis, 4th ed. Wiley, Hoboken, New Jersey, 2006.
[Mor.1] V. A. Morozov, Regularization Methods for Ill-posed Problems. CRC Press, Boca Raton, Florida, 1993.
[Nad.1] E. A. Nadaraya, “On estimating regression,” Theory of Probability and its Applications, vol. 9, pp. 141-142, 1964.
[Park.1] J. Park and I. W. Sandberg, “Universal approximation using radial basis function networks,” Neural Computation, vol. 3, no. 2, pp. 246-257, 1991.
[Pog.1] T. Poggio and F. Girosi, “A theory of networks for approximation and learning,” A.I. Memo No. 1140, MIT, Cambridge, 1989.
[Pog.2] T. Poggio and F. Girosi, “Networks and the best approximation property,” A.I. Memo No. 1164, MIT, Cambridge, 1989.
[Pog.3] T. Poggio and F. Girosi, “Regularization algorithms for learning that are equivalent to multilayer networks,” Science, vol. 247, no. 4945, pp. 978-982, 1990.
[Pog.4] T. Poggio and F. Girosi, “Networks for approximation and learning,” Proceedings of the IEEE, vol. 78, pp. 1481-1497, 1990.
[Pog.5] T. Poggio and F. Girosi, “Extensions of a theory of networks for approximation and learning: Dimensionality reduction and clustering,” A.I. Memo No. 1167, MIT, Cambridge, 1990.
[Rum.1] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature, vol. 323, pp. 533-536, 1986.
[Rup.1] D. Ruppert, M. P. Wand, and R. J. Carroll, Semiparametric Regression. Cambridge University Press, New York, 2003.
[Sto.1] C. Stone, “Additive regression and other nonparametric models,” Annals of Statistics, vol. 13, no. 2, pp. 689-705, 1985.
[Sun.1] N. Sundararajan, P. Saratchandran, and Y. W. Lu, Radial Basis Function Neural Networks with Sequential Learning. World Scientific, Singapore, 1999.
[Tik.1] A. N. Tikhonov, “On solving incorrectly posed problems and method of regularization,” Doklady Akademii Nauk SSSR, vol. 151, pp. 501-504, 1963.
[Tik.2] A. N. Tikhonov and V. Y. Arsenin, Solutions of Ill-posed Problems. V. H. Winston, Washington, D. C., 1977.
[Wat.1] G. S. Watson, “Smooth regression analysis,” Sankhyā, Series A, vol. 26, pp. 359-372, 1964.
[Wer.1] P. J. Werbos, Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD Thesis, Harvard University, Cambridge, 1974.