A Novel Probabilistic Modeling Framework for Person-to-Person Interaction Recognition in Video Surveillance
- Authors
- Heung-Il Suk; Seong-Whan Lee
- Issue Date
- 2011
- Publisher
- Korean Institute of Information Scientists and Engineers (KIISE)
- Keywords
- Person-to-Person Interaction Recognition; Dynamic Bayesian Network; Video Surveillance
- Citation
- Journal of KIISE: Software and Applications, v.38, no.11, pp.613 - 625
- Indexed
- KCI
- Journal Title
- Journal of KIISE: Software and Applications
- Volume
- 38
- Number
- 11
- Start Page
- 613
- End Page
- 625
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/113662
- ISSN
- 1229-6848
- Abstract
- In this paper, we propose a novel probabilistic modeling framework for the automatic analysis and understanding of human interactions in visual surveillance tasks. Our principal assumption is that an interaction episode is composed of meaningful small unit interactions, which we call 'sub-interactions.' We model each sub-interaction with a dynamic probabilistic model built on spatio-temporal characteristics, and propose a Modified Factorial Hidden Markov Model (MFHMM) with factored observations. The complete interaction is represented by a network of Dynamic Probabilistic Models (DPMs) formed by an ordered concatenation of sub-interaction models. The rationale for this approach is that reusing common components, i.e., sub-interaction models, is more effective for describing complex interaction patterns. We demonstrate the feasibility and effectiveness of the proposed method by analyzing the structure of the network of DPMs and by evaluating it on two different databases: a self-collected dataset and Tsinghua University's dataset.
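The core idea in the abstract, scoring a complete interaction as an ordered concatenation of sub-interaction models, can be sketched in miniature. This is an illustrative sketch only, not the authors' MFHMM: each sub-interaction here is a plain discrete-observation HMM scored with the forward algorithm, the segmentation is assumed to be given, and all model names and parameter values are toy assumptions invented for this example.

```python
import math

# Hypothetical toy sketch: each sub-interaction is modeled by a small discrete
# HMM, and an interaction is scored by summing the log-likelihoods of its
# ordered sub-interaction segments. The paper's actual model (MFHMM with
# factored observations) is richer than this.

class SubInteractionHMM:
    def __init__(self, start, trans, emit):
        self.start = start  # start[i]    : P(state_0 = i)
        self.trans = trans  # trans[i][j] : P(state_t = j | state_{t-1} = i)
        self.emit = emit    # emit[i][o]  : P(obs = o | state = i)

    def log_likelihood(self, obs):
        """Forward algorithm; returns log P(obs | model)."""
        n = len(self.start)
        alpha = [self.start[i] * self.emit[i][obs[0]] for i in range(n)]
        for o in obs[1:]:
            alpha = [
                sum(alpha[i] * self.trans[i][j] for i in range(n)) * self.emit[j][o]
                for j in range(n)
            ]
        return math.log(sum(alpha))


def interaction_log_likelihood(models, segments):
    """Score an interaction as an ordered concatenation of sub-interaction models."""
    return sum(m.log_likelihood(seg) for m, seg in zip(models, segments))


# Two invented sub-interaction models over a binary observation alphabet {0, 1}.
approach = SubInteractionHMM([0.8, 0.2], [[0.9, 0.1], [0.2, 0.8]],
                             [[0.9, 0.1], [0.3, 0.7]])
handshake = SubInteractionHMM([0.5, 0.5], [[0.6, 0.4], [0.4, 0.6]],
                              [[0.2, 0.8], [0.7, 0.3]])

# An observed interaction split into its two ordered sub-interaction segments.
segments = [[0, 0, 0, 1], [1, 1, 0, 1]]
score = interaction_log_likelihood([approach, handshake], segments)
print(score)
```

Under this decomposition, recognition reduces to comparing such concatenated-model scores across candidate interaction classes, which is why shared sub-interaction models can be reused across many complex interaction patterns.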
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles