K-EPIC: Entity-Perceived Context Representation in Korean Relation Extraction
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hur, Yuna | - |
dc.contributor.author | Son, Suhyune | - |
dc.contributor.author | Shim, Midan | - |
dc.contributor.author | Lim, Jungwoo | - |
dc.contributor.author | Lim, Heuiseok | - |
dc.date.accessioned | 2022-02-13T20:40:50Z | - |
dc.date.available | 2022-02-13T20:40:50Z | - |
dc.date.created | 2022-01-19 | - |
dc.date.issued | 2021-12 | - |
dc.identifier.issn | 2076-3417 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/135662 | - |
dc.description.abstract | Relation Extraction (RE) aims to predict the correct relation between two entities in a given sentence. To obtain the proper relation in RE, it is essential to comprehend the precise meaning of the two entities as well as the context of the sentence. In contrast to RE research in English, few Korean RE studies focus on entities while preserving Korean linguistic properties. Therefore, we propose K-EPIC (Entity-Perceived Context representation in Korean) to enhance the capability of understanding the meaning of entities while accounting for the linguistic characteristics of Korean. We present experimental results on the BERT-Ko-RE and KLUE-RE datasets with four different types of K-EPIC methods, utilizing entity position tokens. To compare the ability of Korean pre-trained language models to understand entities and context, we analyze HanBERT, KLUE-BERT, KoBERT, KorBERT, KoELECTRA, and multilingual BERT (mBERT). The experimental results demonstrate that the F1 score increases significantly with K-EPIC and that language models trained on Korean corpora outperform the baseline. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.title | K-EPIC: Entity-Perceived Context Representation in Korean Relation Extraction | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Lim, Heuiseok | - |
dc.identifier.doi | 10.3390/app112311472 | - |
dc.identifier.scopusid | 2-s2.0-85120806056 | - |
dc.identifier.wosid | 000735064400001 | - |
dc.identifier.bibliographicCitation | APPLIED SCIENCES-BASEL, v.11, no.23 | - |
dc.relation.isPartOf | APPLIED SCIENCES-BASEL | - |
dc.citation.title | APPLIED SCIENCES-BASEL | - |
dc.citation.volume | 11 | - |
dc.citation.number | 23 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Materials Science | - |
dc.relation.journalResearchArea | Physics | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Engineering, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Physics, Applied | - |
dc.subject.keywordAuthor | Korean pre-trained language model | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | information extraction | - |
dc.subject.keywordAuthor | relation extraction | - |
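The abstract describes K-EPIC methods that utilize entity position tokens. A minimal sketch of the general entity-marker idea, assuming the common practice of wrapping subject and object entities in special tokens before encoding; the marker strings (`[E1]`, `[E2]`) and the function below are illustrative, not the authors' exact implementation:

```python
# Hypothetical sketch of entity position tokens for relation extraction:
# wrap the subject and object entity spans in marker tokens so the encoder
# can perceive entity boundaries. Marker names are illustrative only.

def mark_entities(tokens, subj_span, obj_span):
    """Insert marker tokens around the subject and object entity spans.

    tokens: list of word pieces.
    subj_span, obj_span: (start, end) indices, end exclusive.
    """
    marked = []
    for i, tok in enumerate(tokens):
        if i == subj_span[0]:
            marked.append("[E1]")   # subject start marker
        if i == obj_span[0]:
            marked.append("[E2]")   # object start marker
        marked.append(tok)
        if i == subj_span[1] - 1:
            marked.append("[/E1]")  # subject end marker
        if i == obj_span[1] - 1:
            marked.append("[/E2]")  # object end marker
    return marked

sentence = ["Yuna", "works", "at", "Korea", "University"]
print(mark_entities(sentence, (0, 1), (3, 5)))
# ['[E1]', 'Yuna', '[/E1]', 'works', 'at', '[E2]', 'Korea', 'University', '[/E2]']
```

The marked sequence would then be tokenized and fed to one of the pre-trained encoders listed above, with the marker tokens added to the model's vocabulary as special tokens.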