Estimation of distortion sensitivity for visual quality prediction using a convolutional neural network
- Authors
- Bosse, Sebastian; Becker, Soeren; Mueller, Klaus-Robert; Samek, Wojciech; Wiegand, Thomas
- Issue Date
- Aug-2019
- Publisher
- ACADEMIC PRESS INC ELSEVIER SCIENCE
- Keywords
- Deep learning; Distortion sensitivity; Image quality assessment; Perceptual coding; Visual perception
- Citation
- DIGITAL SIGNAL PROCESSING, v.91, pp.54 - 65
- Indexed
- SCIE; SCOPUS
- Journal Title
- DIGITAL SIGNAL PROCESSING
- Volume
- 91
- Start Page
- 54
- End Page
- 65
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/63599
- DOI
- 10.1016/j.dsp.2018.12.005
- ISSN
- 1051-2004
- Abstract
- The PSNR and the MSE are the computationally simplest, and thus most widely used, measures of image quality, although they correlate only poorly with perceived visual quality. More accurate quality models that require processing of both the reference and the distorted image are difficult to integrate into time-critical communication systems, where computational complexity is disadvantageous. This paper derives the concept of distortion sensitivity as a property of the reference image that compensates, for a given computational quality model, a potential lack of perceptual relevance. Applied to the PSNR, this compensation leads to a local weighting scheme for the MSE. The local weights are estimated by a deep convolutional neural network and used to improve the PSNR, gracefully shifting the computationally complex processing to the reference image only. The performance of the proposed estimation approach is evaluated on the LIVE, TID2013 and CSIQ databases and shows comparable or superior performance relative to benchmark image quality measures. (C) 2018 The Author(s). Published by Elsevier Inc.
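- Illustration
- The following is a minimal sketch of the locally weighted MSE/PSNR idea described in the abstract, not the paper's actual implementation. It assumes 8-bit intensities and per-pixel sensitivity weights normalized to mean one; `predict_sensitivity` is a hypothetical stand-in for the paper's convolutional network, which predicts the weights from the reference image alone and is not reproduced here.

```python
import numpy as np

def weighted_psnr(ref: np.ndarray, dist: np.ndarray, w: np.ndarray,
                  peak: float = 255.0) -> float:
    """PSNR computed from a locally weighted MSE: wMSE = mean(w * (ref - dist)^2).

    Assumes mean(w) ~= 1 so that uniform weights recover the plain PSNR.
    """
    err = (ref.astype(np.float64) - dist.astype(np.float64)) ** 2
    wmse = np.mean(w * err)
    return 10.0 * np.log10(peak ** 2 / wmse)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, (64, 64)).astype(np.float64)
    dist = ref + rng.normal(0, 5, ref.shape)  # mildly distorted copy
    # In the paper's setting the weights would come from a CNN applied to
    # the reference only, e.g.:  w = predict_sensitivity(ref)  (hypothetical).
    w = np.ones_like(ref)  # uniform weights reduce to the ordinary PSNR
    print(weighted_psnr(ref, dist, w))
```

- Because the weights depend only on the reference, a sender can precompute them once per reference image, keeping the per-distorted-image cost as low as that of the plain MSE.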
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles