A family of large margin linear classifiers and its application in dynamic environments

Title: A family of large margin linear classifiers and its application in dynamic environments
Publication Type: Journal Article
Year of Publication: 2009
Authors: Shen, J., and T. G. Dietterich
Journal: Statistical Analysis and Data Mining
Pagination: 328–345
Date Published: 12/2009
Keywords: activity recognition, classification, feature selection, nonstationary environments, online learning

Real-time prediction problems pose a challenge to machine learning algorithms because learning must be fast, the set of classes may be changing, and the relevance of some features to each class may be changing. To learn robust classifiers in such nonstationary environments, it is essential not to assign too much weight to any single feature. We address this problem by combining regularization mechanisms with online large-margin learning algorithms. We prove bounds on their error and show that removing features with small weights has little influence on prediction accuracy, suggesting that these methods exhibit feature selection ability. We show that such regularized learning algorithms automatically decrease the influence of older training instances and focus on the more recent ones. This makes them especially attractive in dynamic environments. We evaluate our algorithms on real data sets and in an online activity recognition system. The results show that these regularized large-margin methods adapt more rapidly to changing distributions and achieve lower overall error rates than state-of-the-art methods.
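The combination the abstract describes can be illustrated with a minimal sketch: an online large-margin (perceptron-style) update followed by an L1-style soft-thresholding step that shrinks every weight toward zero. This is an illustrative toy, not the paper's actual algorithm; the function name, learning rate, margin, and shrinkage constant are all assumptions chosen for the example. Note how the shrinkage both drives uninformative weights to zero (feature selection) and decays the contribution of old updates (recency focus).

```python
import numpy as np

def margin_step_l1(w, x, y, lr=0.1, margin=1.0, shrink=0.01):
    """One illustrative online update (not the paper's exact method):
    a large-margin perceptron step, then soft-thresholding that decays
    all weights and zeroes the small ones."""
    if y * np.dot(w, x) < margin:          # margin violated: update
        w = w + lr * y * x
    # L1-style shrinkage: every weight moves toward zero each round,
    # so small/stale weights vanish and old updates lose influence
    return np.sign(w) * np.maximum(np.abs(w) - shrink, 0.0)

# toy stream: two features, only the first determines the label
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(200):
    x = rng.normal(size=2)
    y = 1.0 if x[0] > 0 else -1.0
    w = margin_step_l1(w, x, y)
```

After the stream, the weight on the informative feature dominates, while the irrelevant feature's weight is held near zero by the shrinkage, matching the abstract's claim that pruning small weights has little effect on accuracy.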

Short Title: Statistical Analysis Data Mining