Multiple kernel nonnegative matrix factorization
2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2011
https://doi.org/10.1109/ICASSP.2011.5946897

Abstract
Kernel nonnegative matrix factorization (KNMF) is a recent kernel extension of NMF, in which matrix factorization is carried out in a reproducing kernel Hilbert space (RKHS) with a feature mapping φ(·). Given a data matrix X ∈ R^{m×n}, KNMF seeks a decomposition, φ(X) ≈ UV⊤, where the basis matrix takes the form U = φ(X)W and the parameters W ∈ R_+^{n×r} and V ∈ R_+^{n×r} are estimated without explicit knowledge of φ(·). As in most kernel methods, the performance of KNMF heavily depends on the choice of kernel. To alleviate the kernel selection problem that arises when a single kernel is used, we present multiple kernel NMF (MKNMF), in which two learning problems are solved jointly in an unsupervised manner: (1) learning the best convex combination of kernel matrices; (2) learning the parameters W and V. We formulate multiple kernel learning in MKNMF as a linear program and estimate W and V using multiplicative updates, as in KNMF. Experiments on benchmark face datasets confirm the high performance of MKNMF over several existing variants of NMF in the task of feature extraction for face classification.
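The factorization described in the abstract can be sketched numerically. The snippet below is a minimal NumPy illustration, not the paper's exact MKNMF algorithm: it factors a kernel matrix K as K ≈ KWV⊤ with multiplicative updates derived from the Frobenius objective, and combines two Gaussian kernels with fixed, uniform weights where MKNMF would instead learn the weights by linear programming. The kernel choice, bandwidths, rank, and iteration counts are all illustrative assumptions.

```python
# Sketch of KNMF multiplicative updates (assumptions: Gaussian kernels,
# fixed uniform kernel weights; MKNMF learns the weights via an LP instead).
# Notation follows the abstract: phi(X) ~= U V^T with U = phi(X) W, so the
# kernel matrix K = phi(X)^T phi(X) satisfies K ~= K W V^T.
import numpy as np

def gaussian_kernel(X, gamma=0.5):
    """K[i, j] = exp(-gamma * ||x_i - x_j||^2) for columns x_i of X."""
    sq = np.sum(X**2, axis=0)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X.T @ X
    return np.exp(-gamma * np.maximum(d2, 0.0))

def knmf(K, r, n_iter=200, eps=1e-9, seed=0):
    """Factor K ~= K W V^T with W, V >= 0 via multiplicative updates.

    Minimizes ||phi(X) - phi(X) W V^T||_F^2
      = tr(K) - 2 tr(K W V^T) + tr(V W^T K W V^T),
    which involves only K, never phi(.) explicitly.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    W = rng.random((n, r))
    V = rng.random((n, r))
    for _ in range(n_iter):
        # Gradient-derived multiplicative updates; entrywise nonnegativity
        # of K (true for the Gaussian kernel) keeps W and V nonnegative.
        W *= (K @ V) / (K @ W @ (V.T @ V) + eps)
        V *= (K @ W) / (V @ (W.T @ K @ W) + eps)
    return W, V

def reconstruction_error(K, W, V):
    """||phi(X) - phi(X) W V^T||_F^2 expressed through K alone."""
    M = W @ V.T
    return np.trace(K) - 2 * np.trace(K @ M) + np.trace(M.T @ K @ M)

# Toy usage: a convex combination of two kernels with hypothetical fixed
# weights beta; in MKNMF the weights would be optimized, not hand-set.
rng = np.random.default_rng(0)
X = rng.random((10, 30))                      # 30 samples in R^10
kernels = [gaussian_kernel(X, g) for g in (0.1, 1.0)]
beta = [0.5, 0.5]
K = sum(b * Kk for b, Kk in zip(beta, kernels))
W, V = knmf(K, r=5)
```

Because the objective is linear in K, alternating between the W, V updates above and re-solving for the kernel weights is what makes the joint unsupervised formulation tractable.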