Optimization in Reproducing Kernel Hilbert Spaces of Spike Trains

2010, Springer Optimization and Its Applications

https://doi.org/10.1007/978-0-387-88630-5_1

Abstract

This paper presents a framework based on reproducing kernel Hilbert spaces (RKHS) for optimization with spike trains. To establish the RKHS for optimization, we start by introducing kernels for spike trains. It is shown that spike train kernels can be built from ideas of kernel methods or from the intensity functions underlying the spike trains; the latter approach is the main focus of this study. We introduce the memoryless cross-intensity (mCI) kernel as an example of an inner product of spike trains, which defines the RKHS bottom-up as an inner product of intensity functions. Because it is defined in terms of the intensity functions, this approach to spike train kernels has the advantage that points in the RKHS incorporate a statistical description of the spike trains and that the statistical model is stated explicitly. Several properties of the mCI kernel and the RKHS it induces are given to show that this RKHS has the structure necessary for optimization. The issue of estimation from data is also addressed. We conclude with an example of optimization in the RKHS by deriving an algorithm for principal component analysis (PCA) of spike trains.
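The abstract does not reproduce the kernel itself; as a sketch of the construction it describes (assuming, consistent with the authors' related publications, intensity functions λ_{s_i}(t) over an interval of interest 𝒯, spike times t_m^i, and a symmetric kernel κ induced by the intensity estimator), the mCI kernel and its estimator take the form

$$ I(s_i, s_j) = \int_{\mathcal{T}} \lambda_{s_i}(t)\,\lambda_{s_j}(t)\,dt, \qquad \hat{I}(s_i, s_j) = \sum_{m=1}^{N_i} \sum_{n=1}^{N_j} \kappa\bigl(t_m^i - t_n^j\bigr). $$

With exponential smoothing of the spike trains, κ reduces to a Laplacian kernel, so the estimator is a double sum over pairwise spike-time differences, computable in O(N_i N_j) time without binning.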
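To make the closing PCA example concrete, below is a minimal sketch in Python of PCA on the mCI Gram matrix. The names (mci_kernel, spike_train_pca) and the kernel width tau are illustrative choices, not the authors' code; the sketch assumes the Laplacian-kernel estimator above and the standard centered-Gram-matrix eigendecomposition used in kernel PCA.

import numpy as np

def mci_kernel(spikes_i, spikes_j, tau=0.01):
    # Estimated mCI inner product: double sum of a Laplacian kernel
    # over all pairs of spike times (tau is an illustrative width, in seconds).
    ti = np.asarray(spikes_i, dtype=float)[:, None]
    tj = np.asarray(spikes_j, dtype=float)[None, :]
    return float(np.exp(-np.abs(ti - tj) / tau).sum())

def spike_train_pca(trains, n_components=2, tau=0.01):
    # PCA of spike trains via the centered mCI Gram matrix,
    # following the usual kernel-PCA recipe.
    n = len(trains)
    K = np.array([[mci_kernel(si, sj, tau) for sj in trains] for si in trains])
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = J @ K @ J                        # Gram matrix of centered points
    evals, evecs = np.linalg.eigh(Kc)     # eigenvalues in ascending order
    order = np.argsort(evals)[::-1][:n_components]
    evals, evecs = evals[order], evecs[:, order]
    alphas = evecs / np.sqrt(np.maximum(evals, 1e-12))  # unit-norm axes in the RKHS
    return Kc @ alphas                    # projection of each train onto the components

# Toy usage: three spike trains given as lists of spike times (seconds).
# proj = spike_train_pca([[0.01, 0.05], [0.012, 0.049], [0.08]], n_components=2)

Because every step depends on the data only through mCI evaluations, the projections are obtained without ever representing the intensity functions explicitly.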
