한국인 영어학습자 발화의 자동음성인식을 위한 영어 음성사전 구성 : Implementation of a non-native pronunciation dictionary for automatic recognition of utterances by Korean learners of English

DC Field: Value
dc.contributor.author: 장태엽
dc.date.accessioned: 2010-01-12T04:55:49Z
dc.date.available: 2010-01-12T04:55:49Z
dc.date.issued: 2006
dc.identifier.citation: 인문논총, Vol.56, pp. 99-122
dc.identifier.issn: 1598-3021
dc.identifier.uri: https://hdl.handle.net/10371/29669
dc.description.abstract: This paper introduces a way of enhancing automatic speech recognition systems in processing non-native speakers' utterances. The key procedure is to design and provide an English pronunciation dictionary that includes frequent error pronunciations by Korean learners as if they were regular pronunciation variations. First, knowledge-based rules are collected to generate as many non-native variants as possible. Then the relative frequency of each variant is measured, so that more frequent error variants are selected while rare ones are discarded. In automatic recognition experiments with this dictionary, word recognition performance is found to increase considerably for Korean learners' incorrect pronunciations. Although utterances by Korean learners of English are the main target data of the current study, the proposed method is expected to be applicable in other language environments as well.
dc.language.iso: ko
dc.publisher: 서울대학교 인문대학 인문학연구원 (Institute of Humanities, Seoul National University)
dc.subject: 영어학습자의 발음사전 (pronunciation dictionary for English learners)
dc.subject: 발음변이 (pronunciation variation)
dc.title: 한국인 영어학습자 발화의 자동음성인식을 위한 영어 음성사전 구성
dc.title.alternative: Implementation of a non-native pronunciation dictionary for automatic recognition of utterances by Korean learners of English
dc.type: SNU Journal
dc.contributor.AlternativeAuthor: Jang, Tae-Yeoub
dc.citation.journaltitle: 인문논총 (Journal of Humanities)
dc.sortNo: 3
dc.citation.startpage: 99
dc.citation.endpage: 122
dc.citation.pages: 99-122
dc.citation.volume: 56
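The procedure described in the abstract — generating non-native pronunciation variants from knowledge-based rules, then keeping only the frequently observed ones — can be sketched as follows. This is a minimal illustration under assumed conditions: the substitution rules, the ARPAbet-style phone labels, and the frequency threshold below are hypothetical examples, not the paper's actual rule set or data.

```python
# Sketch: rule-based generation of non-native pronunciation variants,
# followed by frequency-based pruning of rare variants.
# The rules below are illustrative assumptions about common Korean-learner
# substitutions, not the rules used in the paper.
from itertools import product

# Hypothetical knowledge-based rules: canonical phone -> possible realizations
# (the canonical phone is listed first).
RULES = {
    "F": ["F", "P"],     # /f/ often realized as /p/
    "TH": ["TH", "S"],   # /th/ often realized as /s/
    "V": ["V", "B"],     # /v/ often realized as /b/
}

def generate_variants(pron):
    """Expand a canonical pronunciation (a list of phones) into every
    rule-licensed variant; the canonical form comes out first."""
    options = [RULES.get(phone, [phone]) for phone in pron]
    return [list(variant) for variant in product(*options)]

def prune_by_frequency(variants, counts, min_count=2):
    """Keep the canonical form plus any variant observed at least
    min_count times in learner data; discard rare variants."""
    canonical = variants[0]
    return [v for v in variants
            if v == canonical or counts.get(tuple(v), 0) >= min_count]
```

For example, the canonical pronunciation `["F", "IH", "SH"]` ("fish") expands to two candidates, and only those attested often enough in (hypothetical) learner transcriptions survive pruning before being added to the recognizer's dictionary as regular pronunciation variations:

```python
variants = generate_variants(["F", "IH", "SH"])
counts = {("P", "IH", "SH"): 5}  # hypothetical observation counts
entries = prune_by_frequency(variants, counts)
```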