Boosted Mixture of Experts: An Ensemble Learning Scheme

Neural Computation, Vol. 11, No. 2, pp. 483-497, 1999
Ran Avnimelech1, Nathan Intrator1
1Department of Computer Science, Sackler Faculty of Exact Sciences, Tel Aviv University, Tel-Aviv, Israel

Abstract

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set to different experts in a boostlike manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches.
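The abstract's two ingredients, a boost-like partition of the training data among experts and a dynamic (input-dependent) combination of their outputs, can be illustrated with a toy sketch. This is not the authors' algorithm; it is a minimal stand-in, assuming simple class-mean "experts" on synthetic two-class Gaussian data, error-driven reweighting in place of the paper's partitioning rule, and a confidence-based softmax gate in place of a trained combination model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two overlapping Gaussian blobs (hypothetical stand-in
# for the paper's synthetic example).
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def train_expert(X, y, w):
    # "Expert" = weighted class-mean classifier (stand-in for a network).
    mu0 = np.average(X[y == 0], axis=0, weights=w[y == 0])
    mu1 = np.average(X[y == 1], axis=0, weights=w[y == 1])
    return mu0, mu1

def expert_scores(expert, X):
    mu0, mu1 = expert
    d0 = np.sum((X - mu0) ** 2, axis=1)
    d1 = np.sum((X - mu1) ** 2, axis=1)
    return 1.0 / (1.0 + np.exp(d1 - d0))  # soft class-1 score in (0, 1)

# Boost-like partition: each new expert sees a distribution reweighted
# toward examples the previous expert misclassified.
w = np.ones(len(y)) / len(y)
experts = []
for _ in range(3):
    e = train_expert(X, y, w)
    experts.append(e)
    pred = (expert_scores(e, X) > 0.5).astype(int)
    w = np.where(pred != y, w * 2.0, w)
    w /= w.sum()

# Dynamic combination: instead of a fixed vote, gate each expert per input
# by its local confidence |score - 0.5| (a softmax over confidences).
S = np.stack([expert_scores(e, X) for e in experts])  # (n_experts, n_samples)
gate = np.exp(4.0 * np.abs(S - 0.5))
gate /= gate.sum(axis=0, keepdims=True)               # columns sum to 1
combined = (gate * S).sum(axis=0)
acc = np.mean((combined > 0.5).astype(int) == y)
```

The gate here is input-dependent, which is the key contrast with static ensemble averaging: an expert's vote counts more on inputs where it is locally confident, mirroring the dynamic classifier-combination idea described above.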

References

10.1007/BF00058655

10.1007/BF00117832

10.1162/neco.1994.6.6.1289

10.1109/34.58871

10.1109/34.273716

10.1162/neco.1991.3.1.79

10.1162/neco.1994.6.2.181

Raviv, Y. (1996). Connection Science (Special Issue), 8, 356.

10.1007/BF00116037

10.1016/S0893-6080(05)80023-1