A formal analysis of stopping criteria of decomposition methods for support vector machines

IEEE Transactions on Neural Networks, Vol. 13, No. 5, pp. 1045-1052, 2002
Chih-Jen Lin1
1Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan

Abstract

In a previous paper, the author (2001) proved the convergence of a commonly used decomposition method for support vector machines (SVMs). However, there was no theoretical justification for its stopping criterion, which is based on the gap of the violation of the optimality condition. It is essential that this gap asymptotically approach zero, so that existing implementations are guaranteed to stop in a finite number of iterations once a specified tolerance is reached. Here, we prove this result and illustrate it with two extensions: ν-SVM and the multiclass SVM of Crammer and Singer (2001). A further result shows that, in the final iterations of the decomposition method, only a particular set of variables is still being modified. This supports the use of the shrinking and caching techniques in some existing implementations. Finally, we prove the asymptotic convergence of a decomposition method for this multiclass SVM. Discussions on the difference between this convergence proof and the one in another paper by Lin are also included.
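The stopping criterion the abstract refers to is, in implementations such as LIBSVM, the gap between the largest and smallest violating values of the dual optimality (KKT) conditions. A minimal Python sketch follows; the function names and the restriction to the standard binary C-SVM dual are illustrative assumptions, not code from the paper:

```python
import numpy as np

def kkt_violation_gap(alpha, grad, y, C):
    """Maximal KKT violation gap for the dual SVM problem
        min  0.5 * a^T Q a - e^T a
        s.t. 0 <= a_i <= C,  y^T a = 0,
    where grad = Q a - e is the dual objective's gradient.
    Assumes both index sets below are nonempty."""
    # I_up: variables that can still move "up" along y_i
    up = ((alpha < C) & (y == 1)) | ((alpha > 0) & (y == -1))
    # I_low: variables that can still move "down" along y_i
    low = ((alpha < C) & (y == -1)) | ((alpha > 0) & (y == 1))
    m = np.max(-y[up] * grad[up])    # most violating value
    M = np.min(-y[low] * grad[low])  # least violating value
    return m - M

def should_stop(alpha, grad, y, C, eps=1e-3):
    """Stop the decomposition iteration once the gap is within tolerance."""
    return kkt_violation_gap(alpha, grad, y, C) <= eps
```

At an exact dual optimum the gap is nonpositive, so the loop terminates for any positive tolerance; the paper's contribution is proving that the gap in fact approaches zero along the iterates, so this test is reached in finitely many iterations.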

Keywords

Support vector machines; Convergence; Upper bound; Matrix decomposition; Computer science; Kernel

References

10.1109/CVPR.1997.609310
Platt, 1998, Advances in Kernel Methods: Support Vector Learning
10.1162/089976600300015565
Vapnik, 1998, Statistical Learning Theory
10.1162/15324430260185628
10.1007/BF00994018
Joachims, 1998, Advances in Kernel Methods: Support Vector Learning
10.1109/72.991427
10.1109/72.963765
10.1023/A:1012431217818
10.1162/089976601750399335
Chang, 2001, LIBSVM: A library for support vector machines
10.1109/72.977319