Engineering of Intelligent Systems
2001, Lecture Notes in Computer Science
https://doi.org/10.1007/3-540-45517-5

Abstract
This paper describes a machine learning method called Regression by Selecting Best Feature Projections (RSBFP). In the training phase, RSBFP projects the training data onto each feature dimension and estimates the predictive power of each feature by constructing simple linear regression lines: one per continuous feature, and one per category of each categorical feature. This distinction is needed because the predictive power of a continuous feature is constant, whereas it varies across the distinct values of a categorical feature. The simple linear regression lines are then sorted according to their predictive power. In the querying phase, the best linear regression line, and thus the best feature projection, is selected to make predictions.
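The training and querying phases described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: it uses mean squared training error as a stand-in for the paper's predictive-power measure, and it approximates the per-category regression line of a categorical feature by the mean target value of that category (a constant fit), since a projection onto a single category carries no slope information.

```python
import numpy as np

def train_rsbfp(X, y, categorical):
    """Fit one simple linear regression per continuous feature projection,
    and one constant (mean) predictor per category of each categorical
    feature; rank all of them by training error (lowest error first)."""
    models = []
    for j in range(X.shape[1]):
        if j in categorical:
            # one predictor per distinct category value of feature j
            for cat in set(X[:, j]):
                mask = X[:, j] == cat
                pred = y[mask].mean()
                err = np.mean((y[mask] - pred) ** 2)
                models.append((err, j, ('cat', cat, pred)))
        else:
            # simple linear regression on the projection onto feature j
            xj = X[:, j].astype(float)
            slope, intercept = np.polyfit(xj, y, 1)
            err = np.mean((y - (slope * xj + intercept)) ** 2)
            models.append((err, j, ('lin', slope, intercept)))
    models.sort(key=lambda m: m[0])  # best projection first
    return models

def predict_rsbfp(models, x):
    """Querying phase: answer with the best-ranked projection that
    applies to the query instance x."""
    for err, j, spec in models:
        if spec[0] == 'cat':
            if x[j] == spec[1]:      # category must match the query value
                return spec[2]
        else:
            return spec[1] * float(x[j]) + spec[2]
    return None  # no applicable projection
```

For example, on training data where the target is exactly twice a single continuous feature, the best projection is that feature's regression line, and a query at 5.0 is predicted as 10.0.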
References (15)
- Breiman, L, Friedman, J H, Olshen, R A and Stone, C J 'Classification and Regression Trees' Wadsworth, Belmont, California (1984)
- Friedman, J H 'Local Learning Based on Recursive Covering' Department of Statistics, Stanford University (1996)
- Weiss, S and Indurkhya, N 'Rule-based Machine Learning Methods for Functional Prediction' Journal of Artificial Intelligence Research Vol 3 (1995) pp 383-403
- Aha, D, Kibler, D and Albert, M 'Instance-based Learning Algorithms' Machine Learning Vol 6 (1991) pp 37-66
- Quinlan, J R 'Learning with Continuous Classes' Proceedings AI'92 Adams and Sterling (Eds) Singapore (1992) pp 343-348
- Bratko, I and Karalic, A 'First Order Regression' Machine Learning Vol 26 (1997) pp 147-176
- Karalic, A 'Employing Linear Regression in Regression Tree Leaves' Proceedings of ECAI'92, Vienna, Austria, Bernd Neumann (Ed.) (1992) pp 440-441
- Friedman, J H 'Multivariate Adaptive Regression Splines' The Annals of Statistics Vol 19 No 1 (1991) pp 1-141
- Breiman, L 'Stacked Regressions' Machine Learning Vol 24 (1996) pp 49-64
- Kibler, D, Aha D W and Albert, M K 'Instance-based Prediction of Real-valued Attributes' Comput. Intell. Vol 5 (1989) pp 51-57
- Weiss, S and Indurkhya, N 'Optimized Rule Induction' IEEE Expert Vol 8 No 6 (1993) pp 61-69
- Graybill, F, Iyer, H and Burdick, R 'Applied Statistics' Upper Saddle River, NJ (1998)
- Aydın, T 'Regression by Selecting Best Feature(s)' M.S. Thesis, Computer Engineering, Bilkent University, September (2000)
- Aydın, T and Güvenir, H A 'Regression by Selecting Appropriate Features' Proceedings of TAINN'2000, Izmir, June 21-23 (2000) pp 73-82
- Uysal, İ and Güvenir, H A 'Regression on Feature Projections' Knowledge-Based Systems Vol 13 No 4 (2000) pp 207-214