Detailed Information


Embedding Calculus with Nonword Properties Improves Word Sense Disambiguation

Authors
김성태; 송상헌
Issue Date
2021
Publisher
한국언어학회 (Linguistic Society of Korea)
Keywords
Word Sense Disambiguation; Word Embedding; BERT; Embedding Calculus; Probing Task; U-WIN; Lexical Hierarchy
Citation
언어, v.46, no.2, pp.259-292
Indexed
KCI
Journal Title
언어
Volume
46
Number
2
Start Page
259
End Page
292
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/129707
DOI
10.18855/lisoko.2021.46.2.002
ISSN
1229-4039
Abstract
The present study concerns word sense disambiguation in neural language models, using diagnostic classifiers and a hierarchical lexical network. First, we conducted an experiment to see whether the neural models are capable of detecting ambiguous nouns, and how they do so. Second, we carried out an experiment to verify whether the neural models can identify a specific sense of a lexeme, and how they do so. For these experiments, we used Word2Vec and FastText as the fixed embedding models and BERT as the contextualized model. In addition, we examined uniform and weighted sum methods for adding nonword properties (senses) to the embeddings. For ambiguity detection, BERT with general embeddings outperformed the other models. For sense class detection, BERT with nonword properties performed best on lexemes with numerous senses.
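The abstract does not specify the implementation of the uniform and weighted sum methods, but the general idea of combining a word embedding with nonword (sense) vectors can be sketched as below. This is a minimal illustrative guess, not the authors' code: the toy 4-dimensional vectors, the function names, and the choice to normalize the weights into a distribution are all assumptions.

```python
import numpy as np

def uniform_sum(word_vec, sense_vecs):
    # Uniform sum (assumed): the word vector plus the unweighted
    # mean of its sense vectors, so each sense contributes equally.
    return word_vec + np.mean(sense_vecs, axis=0)

def weighted_sum(word_vec, sense_vecs, weights):
    # Weighted sum (assumed): sense vectors scaled by per-sense
    # weights (normalized to sum to 1) before being added.
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return word_vec + weights @ np.asarray(sense_vecs)

# Toy 4-dimensional example; real Word2Vec/FastText/BERT vectors
# would be 300- or 768-dimensional.
word = np.array([1.0, 0.0, 0.0, 0.0])
senses = [np.array([0.0, 1.0, 0.0, 0.0]),   # hypothetical sense A
          np.array([0.0, 0.0, 1.0, 0.0])]   # hypothetical sense B

u = uniform_sum(word, senses)                    # [1.0, 0.5, 0.5, 0.0]
w = weighted_sum(word, senses, weights=[3, 1])   # [1.0, 0.75, 0.25, 0.0]
```

A diagnostic (probing) classifier would then be trained on such combined vectors to predict, e.g., whether the noun is ambiguous or which sense class it belongs to.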
Appears in
Collections
College of Liberal Arts > Department of Linguistics > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
