ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering

Authors

Byeongmin Choi; YongHyun Lee; Yeunwoong Kyung; Eunchan Kim

Issue Date
2022-09
Publisher
Tech Science Press
Citation
Intelligent Automation & Soft Computing, Vol. 36, No. 1, pp. 71-82
Keywords
Commonsense reasoning; question answering; knowledge graph; language representation model
Abstract
Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use the explicit information in external knowledge sources. To address this, methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model achieves better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
ISSN
1079-8587
Language
English
URI
https://hdl.handle.net/10371/187033
DOI
https://doi.org/10.32604/iasc.2023.032783
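
Illustrative sketch

As a rough illustration of the architecture the abstract describes (a pre-trained ALBERT encoder fused with a knowledge-graph representation to score answer choices), the following minimal Python sketch may help. It is not the authors' implementation: the schema graph expansion and semantic-similarity-based graph encoder are replaced by a placeholder pooled graph embedding, and the class and parameter names (AlbertWithGraphScorer, graph_dim) are hypothetical. Only standard PyTorch and Hugging Face Transformers calls are used.

    import torch
    import torch.nn as nn
    from transformers import AlbertModel, AlbertTokenizer

    class AlbertWithGraphScorer(nn.Module):
        """Scores one (question, choice) pair by fusing ALBERT's [CLS]
        vector with a pooled knowledge-graph embedding (placeholder here)."""
        def __init__(self, albert_name="albert-base-v2", graph_dim=100):
            super().__init__()
            self.albert = AlbertModel.from_pretrained(albert_name)
            hidden = self.albert.config.hidden_size
            # Concatenate text and graph features, then map to a scalar score.
            self.scorer = nn.Sequential(
                nn.Linear(hidden + graph_dim, hidden),
                nn.Tanh(),
                nn.Linear(hidden, 1),
            )

        def forward(self, input_ids, attention_mask, graph_emb):
            # graph_emb: (batch, graph_dim) pooled embedding of the schema
            # graph; in the paper this would come from a graph encoder.
            out = self.albert(input_ids=input_ids, attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]            # (batch, hidden)
            fused = torch.cat([cls, graph_emb], dim=-1)  # text + graph features
            return self.scorer(fused).squeeze(-1)        # one score per choice

    tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
    model = AlbertWithGraphScorer()

    # A CommonsenseQA-style multiple-choice example (illustrative only).
    question = "Where would you find a seat that moves up a mountain?"
    choices = ["ski lodge", "ski lift", "stadium"]
    enc = tokenizer([question] * len(choices), choices,
                    padding=True, return_tensors="pt")
    graph_emb = torch.zeros(len(choices), 100)  # stand-in for graph features
    with torch.no_grad():
        scores = model(enc["input_ids"], enc["attention_mask"], graph_emb)
    print(choices[scores.argmax().item()])

In the paper's setting, the zero placeholder would presumably be replaced by the output of a graph encoder over the expanded schema graph, with semantic similarity used to select the relevant knowledge-graph nodes, as the title and abstract indicate.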