
Optimal transfer function neural networks

2001, 9th European Symposium on Artificial …

Abstract

Neural networks typically use neurons of the same type in each layer, but such an architecture cannot yield data models of optimal complexity and accuracy. Networks whose architectures (number of neurons, connections, and type of neurons) are optimized for a given problem are described here. Each neuron may implement a transfer function of a different type. The complexity of such networks is controlled by statistical criteria and by adding penalty terms to the error function. Results of numerical experiments on artificial data are reported.
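The two ideas in the abstract — a layer whose neurons use different transfer functions, and an error function augmented with a complexity penalty — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`hetero_layer`, `penalized_error`) are hypothetical, the penalty shown is the well-known weight-elimination term of Weigend et al., and the choice of sigmoid vs. Gaussian functions is just one example of mixing global and local responses.

```python
import numpy as np

# Two candidate transfer functions: a sigmoidal (global) response
# and a Gaussian (local) response.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gaussian(z):
    return np.exp(-z ** 2)

def hetero_layer(x, weights, biases, funcs):
    """One hidden layer in which each neuron may apply a different
    transfer function, selected per neuron from `funcs`."""
    z = weights @ x + biases
    return np.array([f(zi) for f, zi in zip(funcs, z)])

def penalized_error(y_pred, y_true, weights, lam=0.01):
    """Mean squared error plus a weight-elimination penalty
    (lam * sum w^2 / (1 + w^2)), which pushes small weights toward
    zero and so controls the effective complexity of the network."""
    mse = np.mean((y_pred - y_true) ** 2)
    penalty = lam * np.sum(weights ** 2 / (1.0 + weights ** 2))
    return mse + penalty

# Hypothetical usage: 3 inputs, 2 hidden neurons of different types.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))
b = np.zeros(2)
x = np.array([0.5, -1.0, 0.3])
h = hetero_layer(x, W, b, funcs=[sigmoid, gaussian])
err = penalized_error(h, np.array([0.0, 1.0]), W)
```

In a full ontogenic network of this kind, both the set of transfer functions and the number of neurons would be adjusted during training rather than fixed in advance.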
