
Reliability of paraphrasing scores: Determining appropriate number of items for a paraphrasing test for Korean EFL learners

Authors

Minkyung Kim

Issue Date
2022-01-01
Publisher
Department of English Language and Literature, Seoul National University
Keywords
language assessment; psychometric analysis; writing assessment; Spearman-Brown prophecy formula; paraphrasing task
Abstract
Kim, Minkyung. 2022. Reliability of paraphrasing scores: Determining appropriate
number of items for a paraphrasing test for Korean EFL learners. SNU Working
Papers in English Language and Linguistics 18, 35-48. Paraphrasing is a major
writing skill that has been widely discussed in academic writing but has not been
tested sufficiently in the context of second language writing. Given its importance,
paraphrasing merits consideration as material for writing assessment. Accordingly,
a paraphrasing test was developed in Kim's (2020) MA thesis with strong reliability
and validity. The current study, an extension of that thesis, investigates how the
test can be improved; its aim is to determine the optimal number of items for the
paraphrasing test. Several methods of analysis are employed, including Cronbach's
alpha and the Spearman-Brown prophecy formula. In Kim (2020), the score
reliability of the composite scores for the analytic scoring rubric and of the
holistic total scores is .83 and .88, respectively. However, slightly lower values
are observed for each section of the scoring rubric. To increase the reliability of
the paraphrasing test, the test should contain more than the current five items.
The optimal test length for reaching the targeted reliability of .90 is around 20
items. Furthermore, the exact number of items required for each rating dimension
was estimated. This study offers practical guidance on determining test length to
prospective test designers in the same field. (Seoul National University)
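The Spearman-Brown prophecy formula mentioned in the abstract predicts how score reliability changes as a test is lengthened. A minimal sketch in Python, using only the figures given in the abstract (the composite analytic-rubric reliability of .83, the five-item test length, and the target reliability of .90); the per-dimension analyses in the paper itself use lower observed reliabilities and therefore yield larger lengthening factors:

```python
import math

def spearman_brown_factor(r_observed: float, r_target: float) -> float:
    """Lengthening factor n such that a test n times the current length
    is predicted to reach r_target, given current reliability r_observed:
        n = r_target * (1 - r_observed) / (r_observed * (1 - r_target))
    """
    return (r_target * (1 - r_observed)) / (r_observed * (1 - r_target))

# Composite-score reliability and item count as reported in the abstract.
current_items = 5
factor = spearman_brown_factor(r_observed=0.83, r_target=0.90)
needed_items = math.ceil(current_items * factor)
print(f"lengthening factor: {factor:.2f}")  # ≈ 1.84
print(f"items needed: {needed_items}")      # 10 for the composite score
```

Because each individual rating dimension has a lower observed reliability than the composite, its lengthening factor is larger, which is consistent with the abstract's overall estimate of roughly 20 items.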
Language
English
URI
https://hdl.handle.net/10371/176948
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
