Efficient Algorithms for Exact Inference in Sequence Labeling SVMs
- Authors
- Bauer, Alexander; Goernitz, Nico; Biegler, Franziska; Mueller, Klaus-Robert; Kloft, Marius
- Issue Date
- May-2014
- Publisher
- IEEE (Institute of Electrical and Electronics Engineers, Inc.)
- Keywords
- Dynamic programming; gene finding; hidden Markov SVM; inference; label sequence learning; margin rescaling; slack rescaling; structural support vector machines (SVMs); structured output
- Citation
- IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, v.25, no.5, pp.870 - 881
- Indexed
- SCIE; SCOPUS
- Journal Title
- IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
- Volume
- 25
- Number
- 5
- Start Page
- 870
- End Page
- 881
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/98592
- DOI
- 10.1109/TNNLS.2013.2281761
- ISSN
- 2162-237X
- Abstract
- The task of structured output prediction deals with learning general functional dependencies between arbitrary input and output spaces. In this context, two loss-sensitive formulations for maximum-margin training have been proposed in the literature, which are referred to as margin and slack rescaling, respectively. The latter is believed to be more accurate and easier to handle. Nevertheless, it is not popular due to the lack of known efficient inference algorithms; therefore, margin rescaling, which requires a similar type of inference as normal structured prediction, is the most often used approach. Focusing on the task of label sequence learning, we here define a general framework that can handle a large class of inference problems based on Hamming-like loss functions and the concept of decomposability for the underlying joint feature map. In particular, we present an efficient generic algorithm that can handle both rescaling approaches and is guaranteed to find an optimal solution in polynomial time.
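The abstract notes that margin rescaling "requires a similar type of inference as normal structured prediction": because a Hamming-like loss decomposes over sequence positions, it can be folded into the per-position scores and the loss-augmented argmax solved by standard Viterbi dynamic programming. The sketch below illustrates this idea for the margin-rescaling case only (the paper's contribution also covers the harder slack-rescaling case, which this sketch does not implement); the function name and the simple unary/transition score parameterization are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def loss_augmented_viterbi(unary, trans, y_true):
    """Margin-rescaling inference: argmax_y [ score(y) + Hamming(y, y_true) ].

    unary : (T, K) per-position label scores
    trans : (K, K) transition scores, trans[i, j] = score of label i -> j
    y_true: (T,)  gold label sequence

    Since the Hamming loss decomposes over positions, it is absorbed into
    the unary potentials, after which ordinary Viterbi DP applies.
    """
    T, K = unary.shape
    # Every label disagreeing with y_true[t] gains +1 loss.
    aug = unary + 1.0
    aug[np.arange(T), y_true] -= 1.0

    dp = np.empty((T, K))          # best score of any prefix ending in label k
    back = np.zeros((T, K), dtype=int)
    dp[0] = aug[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + trans   # (K, K): previous -> current
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + aug[t]

    # Backtrack the optimal label sequence.
    y = np.zeros(T, dtype=int)
    y[-1] = int(dp[-1].argmax())
    for t in range(T - 1, 0, -1):
        y[t - 1] = back[t, y[t]]
    return y, float(dp[-1].max())
```

The DP runs in O(T·K²) time, matching the polynomial-time guarantee the abstract claims; the slack-rescaling objective multiplies rather than adds the loss, which is why it needs the more general framework the paper develops.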
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles