Functional learning through kernels

2009

Abstract

This paper reviews the functional aspects of statistical learning theory. The main point under consideration is the nature of the hypothesis set when no prior information is available other than the data. Within this framework we first discuss the hypothesis set: it is a vector space, it is a set of pointwise-defined functions, and the evaluation functional on this set is a continuous mapping. Based on these principles, an original theory is developed that generalizes the notion of reproducing kernel Hilbert space to non-Hilbertian sets. It is then shown that the hypothesis set of any learning machine has to be a generalized reproducing set. Therefore, thanks to a general "representer theorem", the solution of the learning problem is still a linear combination of a kernel. Furthermore, a way to design these kernels is given. To illustrate this framework, some examples of such reproducing sets and kernels are provided.
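The representer-theorem property mentioned in the abstract — that the minimizer of a regularized learning problem is a finite linear combination of kernel evaluations at the training points — can be illustrated with kernel ridge regression. The sketch below is not the paper's construction (the paper works in generalized, non-Hilbertian reproducing sets); it is a minimal Hilbert-space example, with an assumed Gaussian kernel and hypothetical toy data, showing the form f(x) = Σᵢ αᵢ k(xᵢ, x) that the theorem guarantees.

```python
import numpy as np

# Hypothetical toy data: noisy samples of a sine function.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=30)

def gaussian_kernel(A, B, sigma=0.5):
    """Gaussian (RBF) kernel k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

# Regularized empirical risk with squared loss has a closed-form minimizer:
# alpha solves (K + lam * I) alpha = y, where K is the Gram matrix.
lam = 1e-2
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def f(X_new):
    """Representer form: f(x) = sum_i alpha_i k(x_i, x)."""
    return gaussian_kernel(X_new, X) @ alpha

# The fitted function interpolates the noisy targets up to regularization.
print("max in-sample residual:", np.max(np.abs(f(X) - y)))
```

Note that the solution lives entirely in the span of the kernel sections k(xᵢ, ·): no explicit feature map or basis for the hypothesis space is ever needed, which is precisely what makes the reproducing-kernel structure of the hypothesis set essential.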
