EXTRACTING FEATURE SUBSPACE FOR KERNEL BASED LINEAR PROGRAMMING SUPPORT VECTOR MACHINES

Description

We propose linear programming formulations of support vector machines (SVMs). Unlike standard SVMs, which require solving quadratic programs, our approach explores a fairly low-dimensional subspace of the feature space to construct a nonlinear discriminator. This allows us to obtain the discriminator by solving a smaller linear program. We show that an orthonormal basis of the subspace can be handled implicitly through the eigenvectors of the Gram matrix defined by the associated kernel function. When the number of given data points is very large, we construct the subspace by random sampling of data points. Numerical experiments indicate that a subspace generated from fewer than 2% of the training data points achieves reasonable performance on a fairly large instance with 60,000 data points.
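The construction described above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it samples a small set of landmark points, builds an orthonormal subspace basis from the eigendecomposition of the small Gram matrix, and then trains an L1-regularised soft-margin classifier by solving a linear program with SciPy. The RBF kernel, the toy data set, and all parameter values are assumptions for demonstration only.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

def rbf(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy two-class data (hypothetical stand-in for a real training set).
n = 200
X = rng.normal(size=(n, 2))
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0, 1.0, -1.0)

# Randomly sample a small subset of points to span the feature subspace.
m = 20
idx = rng.choice(n, size=m, replace=False)
S = X[idx]

# Orthonormal basis of the subspace, implicitly, via the eigenvectors
# of the small Gram matrix on the sampled points.
K_mm = rbf(S, S)
vals, vecs = np.linalg.eigh(K_mm)
keep = vals > 1e-8
T = vecs[:, keep] / np.sqrt(vals[keep])   # maps k(x, S) to subspace coordinates

Phi = rbf(X, S) @ T                       # n x d representation in the subspace
d = Phi.shape[1]

# L1-regularised soft-margin SVM as a linear program:
#   min  sum(u + v) + C * sum(xi)
#   s.t. y_i ((u - v) . phi_i + b1 - b2) >= 1 - xi_i,
#        u, v, b1, b2, xi >= 0          (w = u - v, b = b1 - b2)
C = 1.0
c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])
Yphi = y[:, None] * Phi
A_ub = np.hstack([-Yphi, Yphi, -y[:, None], y[:, None], -np.eye(n)])
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")

w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d] - res.x[2 * d + 1]
pred = np.sign(Phi @ w + b)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The LP has only 2d + 2 + n variables, where d is at most the number of sampled points, which is what makes this cheaper than a full quadratic program when m is a small fraction of n.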
