A novel online action detection framework from untrimmed video streams
- Authors
- Yoon, Da-Hye; Cho, Nam-Gyu; Lee, Seong-Whan
- Issue Date
- October 2020
- Publisher
- ELSEVIER SCI LTD
- Keywords
- Online action detection; Untrimmed video stream; Future frame generation; 3D convolutional neural network; Long short-term memory
- Citation
- PATTERN RECOGNITION, v.106
- Indexed
- SCIE; SCOPUS
- Journal Title
- PATTERN RECOGNITION
- Volume
- 106
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/53037
- DOI
- 10.1016/j.patcog.2020.107396
- ISSN
- 0031-3203
- Abstract
- Online temporal action localization from an untrimmed video stream is a challenging problem in computer vision. It is challenging because i) more than one action instance, along with background scenes, may appear in an untrimmed video stream, and ii) in an online setting, only past and current information is available. Therefore, temporal priors, such as the average action duration of the training data, which have been exploited by previous action detection methods, are not suitable for this task because of the high intra-class variation in human actions. We propose a novel online action detection framework that considers actions as a set of temporally ordered subclasses and leverages a future frame generation network to cope with the limited-information issue outlined above. Additionally, we augment our data by varying the lengths of videos so that the proposed method can learn the high intra-class variation in human actions. We evaluate our method on two benchmark datasets, THUMOS'14 and ActivityNet, in an online temporal action localization scenario and demonstrate performance comparable to state-of-the-art methods proposed for offline settings. (C) 2020 Elsevier Ltd. All rights reserved.
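The abstract mentions augmenting training data by varying video lengths so the model sees the same action at different durations. The paper's exact procedure is not given here; the following is a minimal sketch of one common way to do this (temporal resampling of frame indices), with the function name, scale factors, and resampling choice being illustrative assumptions rather than the authors' method.

```python
import numpy as np

def vary_length_augment(frames, scales=(0.5, 0.75, 1.0, 1.25, 1.5)):
    """Illustrative length augmentation: produce variable-length copies
    of a clip by resampling its frame indices, so a model can observe
    the same action at several durations. (Hypothetical helper, not the
    paper's actual augmentation pipeline.)"""
    n = len(frames)
    augmented = []
    for s in scales:
        m = max(1, int(round(n * s)))
        # Linearly spaced indices resample the clip to the new length:
        # s < 1 drops frames (faster action), s > 1 repeats frames (slower).
        idx = np.linspace(0, n - 1, num=m).round().astype(int)
        augmented.append([frames[i] for i in idx])
    return augmented

# Toy example: a "video" of 8 frame labels.
clips = vary_length_augment(list(range(8)))
print([len(c) for c in clips])  # → [4, 6, 8, 10, 12]
```

In practice each resampled clip keeps the original action label, so the detector is trained on short and long renditions of the same class, which is one way to expose it to intra-class duration variation.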
- Files in This Item
- There are no files associated with this item.
- Appears in
Collections - Graduate School > Department of Artificial Intelligence > 1. Journal Articles