An efficient method for simplifying support vector machines
2005, … of the 22nd international conference on …
https://doi.org/10.1145/1102351.1102429

Abstract
In this paper we describe a new method for reducing the complexity of support vector machines by reducing the number of support vectors included in their solutions. The reduction process iteratively selects the two nearest support vectors belonging to the same class and replaces them with a newly constructed vector. Through an analysis of the relation between vectors in the input and feature spaces, we show that constructing the new vector requires only finding the unique maximum of a one-variable function on the interval (0, 1), rather than minimizing a function of many variables with local minima, as in earlier reduced-set methods. Experimental results on real-life datasets show that the proposed method is effective in reducing the number of support vectors while preserving the machine's generalization performance.
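The merging step described above can be sketched as follows. For a Gaussian kernel, the replacement vector can be taken on the segment between the two selected support vectors, z = k·x_i + (1−k)·x_j, so that choosing z reduces to maximizing a one-variable function of k on (0, 1). This is a minimal illustration under stated assumptions, not the paper's exact construction: the objective f(k) below (the projection of the pair's combined feature-space image onto Φ(z)) and the plain grid search are stand-ins for the one-variable maximization the paper derives, and `merge_pair` with all its parameter names is hypothetical.

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def merge_pair(xi, ai, xj, aj, gamma=1.0, grid=1000):
    """Replace two same-class support vectors xi, xj (weights ai, aj > 0)
    by a single vector z = k*xi + (1-k)*xj.

    k maximizes the surrogate objective
        f(k) = ai*K(xi, z) + aj*K(xj, z)   on (0, 1),
    i.e. the projection of ai*Phi(xi) + aj*Phi(xj) onto Phi(z).
    A plain grid search stands in for the paper's one-variable maximization.
    """
    def f(k):
        z = [k * a + (1 - k) * b for a, b in zip(xi, xj)]
        return ai * rbf(xi, z, gamma) + aj * rbf(xj, z, gamma)

    # Search the open interval (0, 1) on a uniform grid.
    k = max((i / grid for i in range(1, grid)), key=f)
    z = [k * a + (1 - k) * b for a, b in zip(xi, xj)]
    # New coefficient: project the pair's combined image onto Phi(z).
    # Since K(z, z) = 1 for the RBF kernel, beta is just f(k).
    beta = ai * rbf(xi, z, gamma) + aj * rbf(xj, z, gamma)
    return z, beta, k
```

A reduction loop would then repeatedly apply `merge_pair` to the current nearest same-class pair, replacing the two vectors and their coefficients with (z, beta), until the desired machine size is reached.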