A formal analysis of stopping criteria of decomposition methods for support vector machines
Abstract
In a previous paper, the author (2001) proved the convergence of a commonly used decomposition method for support vector machines (SVMs). However, there was no theoretical justification of its stopping criterion, which is based on the gap of the violation of the optimality condition. It is essential that this gap asymptotically approach zero, so that existing implementations are guaranteed to stop in a finite number of iterations once a specified tolerance is reached. Here, we prove this result and illustrate it with two extensions: the ν-SVM and the multiclass SVM of Crammer and Singer (2001). A further result shows that, in the final iterations of the decomposition method, only a particular set of variables is still being modified; this supports the use of the shrinking and caching techniques in some existing implementations. Finally, we prove the asymptotic convergence of a decomposition method for this multiclass SVM. Differences between this convergence proof and the one in another paper by Lin are also discussed.
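For concreteness, the criterion can be stated in the notation commonly used for the standard C-SVM dual; the following sketch reflects LIBSVM-style implementations and is not quoted from the paper itself. With dual objective $f(\alpha) = \tfrac{1}{2}\alpha^{\top} Q \alpha - e^{\top}\alpha$ subject to $y^{\top}\alpha = 0$ and $0 \le \alpha_i \le C$, the decomposition method stops once
\[
m(\alpha) - M(\alpha) \le \epsilon, \qquad
m(\alpha) = \max_{i \in I_{\mathrm{up}}(\alpha)} -y_i \nabla f(\alpha)_i, \quad
M(\alpha) = \min_{i \in I_{\mathrm{low}}(\alpha)} -y_i \nabla f(\alpha)_i,
\]
where $I_{\mathrm{up}}(\alpha) = \{\, i : \alpha_i < C,\ y_i = 1 \ \text{or}\ \alpha_i > 0,\ y_i = -1 \,\}$, $I_{\mathrm{low}}(\alpha) = \{\, i : \alpha_i < C,\ y_i = -1 \ \text{or}\ \alpha_i > 0,\ y_i = 1 \,\}$, and $\epsilon > 0$ is the user-specified tolerance. The quantity $m(\alpha) - M(\alpha)$ is the gap of the violation of the optimality condition referred to above; the result proved in the paper is that this gap approaches zero along the iterates, so the criterion is met after finitely many iterations for any $\epsilon > 0$.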
Keywords
Support vector machines, Convergence, Upper bound, Matrix decomposition, Computer science, Kernel
References
10.1109/CVPR.1997.609310
Platt, 1998, Advances in Kernel Methods: Support Vector Learning
10.1162/089976600300015565
Vapnik, 1998, Statistical Learning Theory
10.1162/15324430260185628
10.1007/BF00994018
Joachims, 1998, Advances in Kernel Methods: Support Vector Learning
10.1109/72.991427
10.1109/72.963765
10.1023/A:1012431217818
10.1162/089976601750399335
Chang, 2001, LIBSVM: A library for support vector machines
10.1109/72.977319