Incorporating frequent pattern analysis into multimodal HMM event classification for baseball videos

Hsuan Sheng Chen*, W. J. Tsai

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Data mining and frequent pattern analysis have recently become popular ways of discovering new knowledge in a data set, but they are rarely applied to video semantic analysis. This paper therefore introduces two methods, the frequent-pattern trained HMM and the frequent-pattern tailored HMM, which incorporate frequent pattern analysis into multimodal HMM event classification for baseball videos. In addition, it compares different symbol coding methods for multimodal HMM classification, including temporal sequence coding and co-occurrence symbol coding. Experiments on baseball video event classification demonstrate that integrating frequent pattern analysis can improve event classification performance.
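The core classification step described above — scoring a symbol-coded video sequence against one HMM per event class and picking the most likely class — can be sketched as follows. This is a minimal illustration with hypothetical toy parameters (two states, three symbols, invented model values), not the paper's trained models or its frequent-pattern extensions:

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard forward algorithm."""
    n_states = len(pi)
    # Initialization: alpha[i] = log(pi_i * B_i(o_0))
    alpha = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n_states)]
    # Induction: sum over predecessor states, then emit the next symbol
    for t in range(1, len(obs)):
        alpha = [
            math.log(sum(math.exp(alpha[i]) * A[i][j] for i in range(n_states)))
            + math.log(B[j][obs[t]])
            for j in range(n_states)
        ]
    # Termination: total probability of the sequence
    return math.log(sum(math.exp(a) for a in alpha))

# Hypothetical 2-state HMMs over a 3-symbol alphabet, one per event class
# (initial distribution pi, transition matrix A, emission matrix B).
models = {
    "home_run": ([0.6, 0.4],
                 [[0.7, 0.3], [0.4, 0.6]],
                 [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]),
    "strikeout": ([0.5, 0.5],
                  [[0.9, 0.1], [0.2, 0.8]],
                  [[0.2, 0.2, 0.6], [0.6, 0.3, 0.1]]),
}

seq = [0, 1, 2, 2]  # a symbol-coded multimodal feature sequence
scores = {name: forward_log_likelihood(seq, *params)
          for name, params in models.items()}
best = max(scores, key=scores.get)
print(best, scores)
```

The paper's contribution sits on top of this baseline: frequent patterns mined from the symbol sequences are used either to train the HMMs or to tailor them, rather than changing the classification rule itself.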

Original language: English
Pages (from-to): 4913-4932
Number of pages: 20
Journal: Multimedia Tools and Applications
Volume: 75
Issue number: 9
DOIs
State: Published - 1 May 2016

Keywords

  • Baseball event classification
  • Co-occurrence symbol coding
  • Data mining
  • Frequent pattern analysis
  • Frequent-pattern tailored HMM
  • Frequent-pattern trained HMM
  • HMM
  • Interval-based multimodal feature
  • Multimedia system
  • Temporal sequence symbol coding
  • VOGUE
  • Video semantic analysis
