Social-event-driven camera control for multicharacter animations

I-Cheng Yeh*, Wen-Chieh Lin, Tong-Yee Lee, Hsin-Ju Han, Jehee Lee, Manmyung Kim

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

In a virtual world, a group of virtual characters can interact with one another, and characters may leave one group to join another. The interactions among individuals and groups often produce interesting events in an animation sequence. The goal of this paper is to discover social events involving mutual interactions or group activities in multicharacter animations and to automatically plan a smooth camera motion that views interesting events suggested by our system or relevant events specified by a user. Inspired by sociology studies, we borrow knowledge from proxemics, social-force models, and social network analysis to model the dynamic relations among social events and among the participants within each event. By analyzing the variation of relation strength among participants and the spatiotemporal correlation among events, we discover salient social events in a motion clip and generate an overview video of these events with smooth camera motion using a simulated annealing optimization method. We tested our approach on different motions performed by multiple characters. Our user study shows that our results are preferred in 66.19 percent of comparisons with those of a camera control approach without event analysis and are comparable (51.79 percent) to professional results by an artist.
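The abstract mentions planning smooth camera motion with simulated annealing. The sketch below is a generic, minimal illustration of that optimization pattern, not the paper's actual formulation: the cost function (a hypothetical trade-off between path smoothness and proximity to an event at the origin), the 2D path representation, and all parameter values are assumptions for illustration only.

```python
import math
import random

def anneal_camera_path(cost, initial_path, steps=5000, t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing over a camera path.

    `initial_path` is a list of (x, y) camera positions sampled over time;
    `cost` scores a whole path (lower is better).
    """
    rng = random.Random(seed)
    path = list(initial_path)
    best = list(path)
    temperature = t0
    for _ in range(steps):
        # Local move: perturb one sampled camera position slightly.
        i = rng.randrange(len(path))
        x, y = path[i]
        candidate = list(path)
        candidate[i] = (x + rng.gauss(0, 0.1), y + rng.gauss(0, 0.1))
        delta = cost(candidate) - cost(path)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / T) so the search can
        # escape local minima while the temperature is still high.
        if delta < 0 or rng.random() < math.exp(-delta / temperature):
            path = candidate
            if cost(path) < cost(best):
                best = list(path)
        temperature *= cooling
    return best

def example_cost(path):
    # Hypothetical cost: penalize large frame-to-frame jumps (smoothness)
    # plus distance from an assumed event located at the origin (visibility).
    smooth = sum((ax - bx) ** 2 + (ay - by) ** 2
                 for (ax, ay), (bx, by) in zip(path, path[1:]))
    visibility = sum(x * x + y * y for x, y in path)
    return smooth + 0.1 * visibility
```

For example, starting from a static path far from the event, the annealer should return a path with a cost no worse than the initial one; the paper's real objective additionally accounts for event saliency and viewing constraints.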

Original language: English
Article number: 6065732
Pages (from-to): 1496-1510
Number of pages: 15
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 18
Issue number: 9
DOIs
State: Published - 4 Jun 2012

Keywords

  • event analysis
  • MOCAP
  • multicharacter animation
  • social network analysis
