Detailed Information

Distance-based Directional Self-Attention Network : Distance- and Direction-based Self-Attention Network

Authors

임진배

Advisor
조성준
Major
Department of Industrial Engineering, College of Engineering
Issue Date
2018-02
Publisher
Graduate School, Seoul National University
Keywords
Attention mechanism; Distance between words; Distance mask; Local dependency; Global dependency; Natural Language Inference
Description
Thesis (Master's) -- Graduate School, Seoul National University : Department of Industrial Engineering, College of Engineering, February 2018. Advisor: 조성준.
Abstract
The attention mechanism has traditionally been used as an ancillary means of supporting RNNs or CNNs. Recently, however, the Transformer (Vaswani et al., 2017) achieved state-of-the-art performance in machine translation with a dramatic reduction in training time by using attention alone. Motivated by the Transformer, the Directional Self-Attention Network (Shen et al., 2017), a fully attention-based sentence encoder, was proposed; it performed well on a variety of data by exploiting forward and backward directional information within a sentence. That study, however, did not consider the distance between words, an important feature for learning the local dependencies that help a model understand the context of the input text. We propose the Distance-based Directional Self-Attention Network, which accounts for word distance through a simple distance mask, modeling local dependencies without losing the inherent ability of attention to model global dependencies. Our model performs well on NLI data and sets a new state-of-the-art result on SNLI. We also show that it is particularly strong on long sentences or documents.
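The abstract does not specify the distance mask's exact form, so the following minimal sketch (Python/NumPy) illustrates one plausible realization: attention logits receive a penalty proportional to the token distance |i - j|, on top of a forward or backward directional mask. The function name, the scalar weight alpha, and both mask definitions are assumptions for illustration, not the thesis's exact formulation.

import numpy as np

def softmax(z):
    # Row-wise softmax that tolerates the -inf entries produced by the masks.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distance_masked_self_attention(x, w_q, w_k, w_v, alpha=1.0, direction="forward"):
    # Scaled dot-product self-attention with a directional mask and a
    # distance mask added to the logits. A sketch of the idea in the
    # abstract; alpha and the mask forms are illustrative assumptions.
    n, d = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    logits = (q @ k.T) / np.sqrt(d)          # (n, n) raw attention scores

    i, j = np.indices((n, n))

    # Directional mask: "forward" lets token i attend to positions j <= i,
    # "backward" to positions j >= i; all other positions are blocked.
    if direction == "forward":
        dir_mask = np.where(j <= i, 0.0, -np.inf)
    else:
        dir_mask = np.where(j >= i, 0.0, -np.inf)

    # Distance mask: penalize logits in proportion to |i - j| so nearby
    # tokens dominate (local dependency) while distant tokens can still
    # receive attention when their scores are large (global dependency).
    dist_mask = -alpha * np.abs(i - j)

    weights = softmax(logits + dir_mask + dist_mask)
    return weights @ v                       # contextualized token vectors

Setting alpha=0 recovers plain directional self-attention, while larger values concentrate each token's attention on its neighbors; this is how a distance penalty can add local bias without removing attention's global reach.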
Language
English
URI
https://hdl.handle.net/10371/141437
