A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using the semi-consistent hyperplane that is farthest away from it. In this way, a good balance between goodness-of-fit and model complexity is achieved, where model complexity is proxied by the distance between a test object and a semi-consistent hyperplane. This notion of complexity resembles the one captured by the width of the so-called margin between two classes, which arises in the context of Support Vector Machine learning. Class overlap can be handled via the introduction of kernels and/or slack variables. The performance of SHs against standard classifiers is promising on several widely used empirical data sets.
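The decision rule sketched in the abstract lends itself to a small optimization per candidate label. The snippet below is an illustrative reconstruction, not necessarily the paper's exact formulation: for each candidate label of the test object, it looks for the semi-consistent hyperplane farthest from that object by minimizing ||w||^2 subject to every training point lying on its correct side and the test object being pinned at signed functional margin +1 or -1 (so its distance to the hyperplane is 1/||w||); the label achieving the larger distance is returned. The helper name sh_classify and the use of cvxpy are assumptions made for this sketch.

```python
# Illustrative sketch of the Support Hyperplanes decision rule (assumed
# formulation; the paper's exact optimization problem may differ).
import numpy as np
import cvxpy as cp


def sh_classify(X, y, x_test):
    """Label x_test by the semi-consistent hyperplane farthest from it.

    X : (n, d) training inputs, y : (n,) labels in {-1, +1}, x_test : (d,).
    """
    best_dist, best_label = -np.inf, 0
    for label in (-1.0, 1.0):
        w = cp.Variable(X.shape[1])
        b = cp.Variable()
        constraints = [
            # Semi-consistency: every training point lies on (or touches)
            # its correct side of the hyperplane w'z + b = 0.
            cp.multiply(y, X @ w + b) >= 0,
            # Pin the test object at signed functional margin `label`,
            # so its geometric distance to the hyperplane is 1 / ||w||.
            x_test @ w + b == label,
        ]
        prob = cp.Problem(cp.Minimize(cp.sum_squares(w)), constraints)
        prob.solve()
        if prob.status not in ("optimal", "optimal_inaccurate"):
            continue
        dist = np.inf if prob.value < 1e-12 else 1.0 / np.sqrt(prob.value)
        if dist > best_dist:
            best_dist, best_label = dist, int(label)
    return best_label


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
    y = np.array([-1] * 20 + [1] * 20)
    print(sh_classify(X, y, np.array([1.5, 1.0])))    # expected: 1
    print(sh_classify(X, y, np.array([-1.5, -1.0])))  # expected: -1
```

Solving two small quadratic programs per test object mirrors the distance comparison described in the abstract; kernels or slack variables for overlapping classes would require modifying the constraints accordingly.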


Nalbantov, G., Bioch, C., & Groenen, P. (2006). Classification with support hyperplanes (No. EI 2006-42). Report / Econometric Institute, Erasmus University Rotterdam. Retrieved from http://hdl.handle.net/1765/8012