
Calibrating recurrent sliding window classifiers for sequential supervised learning

Sequential supervised learning problems involve assigning a class label to
each item in a sequence. Examples include part-of-speech tagging and text-to-speech
mapping. A very general-purpose strategy for solving such problems is
to construct a recurrent sliding window (RSW) classifier, which maps some window
of the input sequence plus some number of previously-predicted items into
a prediction for the next item in the sequence. This paper describes a general-purpose
implementation of RSW classifiers and discusses the highly practical
issue of how to choose the size of the input window and the number of previous
predictions to incorporate. Experiments on two real-world domains show that
the optimal choices vary from one learning algorithm to another. They also
depend on the evaluation criterion (number of correctly-predicted items versus
number of correctly-predicted whole sequences). We conclude that window
sizes must be chosen by cross-validation. The results have implications for the
choice of window sizes for other models including hidden Markov models and
conditional random fields.

Graduation date: 2004
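The recurrent-sliding-window strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the `RSWClassifier` class, its table-lookup base learner, the parameter names `in_half` and `n_prev`, and the toy vowel/consonant task are all invented here; a real RSW system would plug in any supervised learning algorithm as the base classifier.

```python
from collections import Counter, defaultdict

def window_features(xs, prev, t, in_half, n_prev, pad="_"):
    """Features for position t: an input window of xs centred on t
    (in_half items on either side) plus the n_prev most recent
    previously-predicted labels."""
    inputs = [xs[i] if 0 <= i < len(xs) else pad
              for i in range(t - in_half, t + in_half + 1)]
    recent = [prev[i] if i >= 0 else pad for i in range(t - n_prev, t)]
    return tuple(inputs + recent)

class RSWClassifier:
    """Recurrent sliding window classifier with a table-lookup base
    learner (a stand-in for any supervised learning algorithm)."""

    def __init__(self, in_half=1, n_prev=1):
        self.in_half, self.n_prev = in_half, n_prev
        self.table = defaultdict(Counter)   # features -> label counts
        self.fallback = None                # overall majority label

    def fit(self, sequences):
        overall = Counter()
        for xs, ys in sequences:
            for t in range(len(xs)):
                # Training uses the true labels in place of the
                # "previous predictions" (teacher forcing).
                f = window_features(xs, ys, t, self.in_half, self.n_prev)
                self.table[f][ys[t]] += 1
                overall[ys[t]] += 1
        self.fallback = overall.most_common(1)[0][0]
        return self

    def predict(self, xs):
        preds = []
        for t in range(len(xs)):            # left-to-right recurrence
            f = window_features(xs, preds, t, self.in_half, self.n_prev)
            counts = self.table.get(f)
            preds.append(counts.most_common(1)[0][0] if counts
                         else self.fallback)
        return preds

# Toy task: label each character as vowel (V) or consonant (C).
train = [(list("abba"), ["V", "C", "C", "V"]),
         (list("baab"), ["C", "V", "V", "C"])]
clf = RSWClassifier(in_half=1, n_prev=1).fit(train)
print(clf.predict(list("abba")))   # ['V', 'C', 'C', 'V']
```

The two window sizes (`in_half` and `n_prev` here) are exactly the quantities the thesis argues must be chosen by cross-validation, since the best values depend on both the base learner and the evaluation criterion.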

Identifier: oai:union.ndltd.org:ORGSU/oai:ir.library.oregonstate.edu:1957/30043
Date: 03 October 2003
Creators: Joshi, Saket Subhash
Contributors: Dietterich, Thomas
Source Sets: Oregon State University
Language: en_US
Detected Language: English
Type: Thesis/Dissertation
