Boosted string representation and its application to video surveillance

Jun-Wei Hsieh*, Yung Tai Hsu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Scopus citations


This paper presents a new behavior analysis system for analyzing human movements via a boosted string representation. First, a triangulation-based method transforms each action sequence into a string of symbols, so that the sequence can be interpreted and analyzed through string operations. Three practical problems must be tackled when analyzing action sequences in this representation: sequences differ in temporal scaling, differ in initial state, and contain symbol-conversion errors. Traditional methods such as hidden Markov models and finite state machines have limited ability to handle these problems because many unknown states must be constructed and initialized. To address them, a novel string hypothesis generator is proposed that produces a bank of string features, from which different invariant features can be learned to classify behaviors more accurately. To learn these invariant features, the AdaBoost algorithm is adopted and modified to train a strong classifier from the set of string hypotheses, so that multiple human action events can be classified well. In addition, a forward classification scheme is proposed to classify input action sequences accurately even when they exhibit various scaling changes and coding errors. Experimental results show that the proposed method is a robust, accurate, and powerful tool for human movement analysis.
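The abstract's core idea of boosting over a bank of string hypotheses can be illustrated with a minimal Python sketch. This is not the paper's actual hypothesis generator or classifier; it assumes, for illustration only, that each weak classifier simply tests whether a candidate substring (a hypothetical stand-in for a string hypothesis) occurs in a symbol string, and that AdaBoost combines such tests into a weighted vote:

```python
import math

def substring_stump(pattern):
    """Weak classifier: +1 if the pattern occurs in the symbol string, else -1.
    (A hypothetical stand-in for one string hypothesis from the bank.)"""
    return lambda s: 1 if pattern in s else -1

def adaboost(strings, labels, patterns, rounds=10):
    """Train a strong classifier as a weighted vote of substring stumps.

    strings  -- symbol strings, one per action sequence
    labels   -- class labels, +1 or -1
    patterns -- candidate substrings (the illustrative 'hypothesis bank')
    """
    n = len(strings)
    w = [1.0 / n] * n          # sample weights, uniform at the start
    strong = []                # list of (alpha, weak_classifier) pairs
    for _ in range(rounds):
        # pick the stump with the lowest weighted error on current weights
        best_err, best_h = None, None
        for p in patterns:
            h = substring_stump(p)
            err = sum(wi for wi, s, y in zip(w, strings, labels) if h(s) != y)
            if best_err is None or err < best_err:
                best_err, best_h = err, h
        err = min(max(best_err, 1e-10), 1 - 1e-10)   # clip to avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)       # classifier weight
        strong.append((alpha, best_h))
        # re-weight: misclassified sequences gain weight for the next round
        w = [wi * math.exp(-alpha * y * best_h(s))
             for wi, s, y in zip(w, strings, labels)]
        z = sum(w)
        w = [wi / z for wi in w]
    return strong

def classify(strong, s):
    """Sign of the weighted vote over all selected weak classifiers."""
    return 1 if sum(a * h(s) for a, h in strong) >= 0 else -1
```

Using presence tests rather than exact string matching gives the sketch some tolerance to temporal scaling and coding errors, loosely mirroring the invariance the paper seeks; the paper's actual features and forward classification scheme are more involved.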

Original language: English
Pages (from-to): 3078-3091
Number of pages: 14
Journal: Pattern Recognition
Issue number: 10
State: Published - 1 Oct 2008


  • Behavior analysis
  • Boosting algorithm
  • Centroid contexts
  • String matching


