MS-SVM: Minimally Spanned Support Vector Machine

Article Type

Research Article

Publication Title

Applied Soft Computing Journal

Abstract

For a Support Vector Machine (SVM) algorithm, the time required to classify an unknown data point is proportional to the number of support vectors. For some real-time applications, using an SVM can be problematic if the number of support vectors is high. Depending on the complexity of the class structure, the number of support vectors of an SVM model sometimes grows with the number of training data points. Our objective here is to reduce the number of support vectors while maintaining more or less the same level of accuracy as a standard SVM that does not use any reduction of support vectors. An SVM finds a separating hyperplane maximizing the margin of separation, and hence the location of the hyperplane depends primarily on a set of "boundary points". We first identify some boundary points using a minimum spanning tree (MST) on the training data to obtain a reduced training set. The SVM algorithm is then applied to the reduced training data to generate the classification model. We call this algorithm the Minimally Spanned Support Vector Machine (MS-SVM). We also assess the performance obtained by relaxing the definition of boundary points. Moreover, we extend the algorithm to a feature space using a kernel transformation; in this case, an MST is generated in the feature space using the associated kernel matrix. Our experimental results demonstrate that the proposed algorithm can considerably reduce the number of support vectors without affecting the overall classification accuracy. This holds irrespective of whether the MST is generated in the input space or in the feature space. Thus the MS-SVM algorithm can be used instead of SVM for efficient classification.
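The abstract outlines the pipeline but not the exact boundary-point rule, so the following is only a minimal sketch of the idea, assuming that "boundary points" are the endpoints of MST edges whose two endpoints carry different class labels, and using scipy and scikit-learn as stand-ins for the authors' implementation. The kernel_distances helper (an illustrative name, not from the paper) shows how the same MST construction could be carried out in a feature space via the standard kernel-induced distance.

```python
# Hedged sketch of the MS-SVM idea: reduce the training set to MST-derived
# "boundary points", then train an ordinary SVM on that reduced set.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
from sklearn.svm import SVC


def ms_svm_fit(X, y, **svm_params):
    # Pairwise Euclidean distances in the input space.
    D = squareform(pdist(X))
    # Minimum spanning tree over the complete graph of training points.
    mst = minimum_spanning_tree(D).tocoo()
    # Assumption: boundary points are endpoints of MST edges that join
    # points with different class labels.
    boundary = set()
    for i, j in zip(mst.row, mst.col):
        if y[i] != y[j]:
            boundary.update((int(i), int(j)))
    idx = np.fromiter(boundary, dtype=int)
    # Standard SVM trained on the reduced "boundary" subset only.
    clf = SVC(**svm_params).fit(X[idx], y[idx])
    return clf, idx


def kernel_distances(K):
    # Feature-space distances induced by a kernel matrix K:
    # d(i, j)^2 = K[i, i] + K[j, j] - 2 * K[i, j].
    diag = np.diag(K)
    sq = np.maximum(diag[:, None] + diag[None, :] - 2.0 * K, 0.0)
    return np.sqrt(sq)
```

For the feature-space variant, the distance matrix returned by kernel_distances(K) would replace the Euclidean distances before building the MST; the rest of the procedure stays the same.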

First Page

356

Last Page

365

DOI

10.1016/j.asoc.2017.12.017

Publication Date

3-1-2018
