Since the introduction of support vector machines (SVMs), a great deal of work has been devoted to making these machines more effective classifiers. In this report, we combine the preconditioned conjugate gradient (PCG) method with an adaptive constraint reduction method proposed in 2007 to improve the performance of training an SVM with an interior-point method (IPM). We reduce the computational effort of assembling the matrix of normal equations by eliminating unnecessary constraints. By using PCG and refactoring the preconditioner only when necessary, we also reduce the time to solve the system of normal equations. In addition, we compare two techniques for updating the preconditioner, each of which considers the two most recent diagonal matrices in the normal equations: the first selects the indices to update based on the difference between the diagonal elements, while the second selects them based on the ratio of these elements. Promising numerical results for dense matrix problems…
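The two selection rules mentioned above can be illustrated with a minimal sketch. The function names, the choice of keeping the top-k indices, and the toy diagonals are assumptions for illustration, not the report's actual implementation; the sketch only shows how a difference-based criterion and a ratio-based criterion can pick different indices from the same pair of diagonals.

```python
import numpy as np

def select_indices_by_difference(d_prev, d_curr, k):
    """Pick the k indices whose diagonal entries changed most in absolute terms."""
    return np.argsort(np.abs(d_curr - d_prev))[-k:]

def select_indices_by_ratio(d_prev, d_curr, k):
    """Pick the k indices whose diagonal entries changed most multiplicatively."""
    ratios = np.maximum(d_curr / d_prev, d_prev / d_curr)  # symmetric ratio >= 1
    return np.argsort(ratios)[-k:]

# Toy diagonals from two successive iterations (hypothetical values).
d_prev = np.array([1.0, 2.0, 100.0, 0.5])
d_curr = np.array([1.1, 8.0, 101.0, 0.05])

print(sorted(select_indices_by_difference(d_prev, d_curr, 2)))  # [1, 2]: large absolute change
print(sorted(select_indices_by_ratio(d_prev, d_curr, 2)))       # [1, 3]: large relative change
```

Note that the two criteria disagree here: index 2 changed by 1.0 in absolute terms but barely at all relatively, while index 3 changed little in absolute terms but by a factor of 10, which is the kind of distinction that can matter when diagonal entries span many orders of magnitude, as they do near IPM convergence.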
Contents: Preconditioning for Training Support Vector Machines
1 Introduction
2 Support Vector Machine
2.1 Overview
3 Literature Review
4 The Convex Quadratic Program for Training the SVM
5 A Primal-Dual Interior-Point Method for Solving the CQP
5.1 The Idea Behind IPMs
5.2 Mehrotra’s Predictor-Corrector Algorithm
5.3 Adaptive Constraint Reduction
6 Conjugate Gradient Method and Preconditioning
6.1 Preconditioned Conjugate Gradient Method
6.2 Updating/Downdating Cholesky Preconditioner
7 Numerical Results
8 Conclusion and Future Work
Bibliography
Source: University of Maryland