Detailed Information


Comparative Analysis of Current Approaches to Quality Estimation for Neural Machine Translation

Authors
Eo, Sugyeong; Park, Chanjun; Moon, Hyeonseok; Seo, Jaehyung; Lim, Heuiseok
Issue Date
Jul-2021
Publisher
MDPI
Keywords
quality estimation; neural machine translation; pretrained language model; multilingual pre-trained language model; WMT
Citation
APPLIED SCIENCES-BASEL, v.11, no.14
Indexed
SCIE
SCOPUS
Journal Title
APPLIED SCIENCES-BASEL
Volume
11
Number
14
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/137244
DOI
10.3390/app11146584
ISSN
2076-3417
Abstract
Quality estimation (QE) has recently gained increasing interest because it can predict the quality of machine translation output without a reference translation. QE is an annual shared task at the Conference on Machine Translation (WMT), and most recent studies have applied multilingual pretrained language models (mPLMs) to this task, focusing on performance improvement through data augmentation combined with fine-tuning of a large-scale mPLM. In this study, we eliminate the effects of data augmentation and conduct a pure performance comparison between various mPLMs. Apart from recent performance-driven QE research conducted for shared-task competitions, we carry out this comparison on the WMT20 sub-tasks and identify an optimal mPLM. Moreover, we demonstrate QE using the multilingual BART model, which has not yet been utilized for this task, and conduct comparative experiments and analyses against cross-lingual language models (XLMs), multilingual BERT, and XLM-RoBERTa.
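Sentence-level QE systems such as those compared in the abstract are commonly evaluated by the Pearson correlation between predicted quality scores and human direct-assessment (DA) judgments (the standard WMT20 sentence-level metric; the paper's exact evaluation setup is not stated here, so this is an illustrative assumption). A minimal sketch of that metric, with hypothetical scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical predicted QE scores vs. human DA scores (illustrative only):
pred = [0.10, 0.40, 0.35, 0.80]
gold = [0.20, 0.50, 0.30, 0.90]
print(pearson_r(pred, gold))
```

A higher Pearson's r means the model's quality predictions track human judgments more closely, which is the axis on which the mPLMs (mBERT, XLM, XLM-R, mBART) would be ranked.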
Files in This Item
There are no files associated with this item.
Appears in
Collections
Graduate School > Department of Computer Science and Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
