COMPARATIVE STUDY ON RELIABILITY BETWEEN ONLINE AND PAPER-BASED VERSIONS OF A TEST FOR READING IN ENGLISH AS A FOREIGN LANGUAGE

Authors

  • Anne C. Ihata Musashino University, Tokyo, Japan

DOI:

https://doi.org/10.20319/pijss.2019.53.215223

Keywords:

EFL, Reading Comprehension, Assessment, Internet-/Paper-Based Tests

Abstract

The aim of this study is to examine how reliable the results of the internet-based version of a reading test are, with a view to replacing paper tests with online versions. This is of increasing importance as universities focus on improving efficiency and supporting the SDGs by going paperless. The study was also prompted by the need to test the reading comprehension of larger numbers of students across the university and to deliver quickly meaningful results on which to base intensive programs of instruction. The Extensive Reading Foundation’s online reading test and the (now discontinued) Edinburgh Project for Extensive Reading’s paper-based placement test were administered to university students under controlled conditions, and the data were analyzed for possible relationships. An initial one-way ANOVA suggested little evidence of a relationship between online and paper-based test scores. However, further analysis using other measures found evidence of interaction between them, and a second ANOVA, restricted to scores for students who had completed all versions of the test, found a significant relationship. Familiarity with both versions of the test was considered as a possible factor. Although this is only a small-scale study, the findings help to support the argument for adopting the online version of the test, with its various potential benefits to schools and educators.
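To illustrate the kind of analysis the abstract describes, the sketch below computes a one-way ANOVA F statistic and a Pearson correlation over paired online and paper-based scores. The score data and the 0-100 scale are invented for the example; the study's actual tests, sample, and SPSS procedures are not reproduced here.

```python
# Hypothetical sketch of the analysis described in the abstract: comparing
# online and paper-based test score groups with a one-way ANOVA F statistic,
# and checking for a relationship with a Pearson correlation.
# All score data below are invented for illustration only.

from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across score groups."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def pearson_r(xs, ys):
    """Pearson correlation between paired online and paper scores."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented paired scores for ten students (0-100 scale).
online = [55, 62, 48, 71, 66, 59, 80, 52, 68, 74]
paper = [58, 60, 50, 75, 63, 61, 78, 55, 70, 72]

f_stat = one_way_anova_f(online, paper)
r = pearson_r(online, paper)
print(f"F = {f_stat:.3f}, r = {r:.3f}")
```

As the abstract notes, a between-groups F test can look unremarkable (group means may be similar) even when paired scores are strongly correlated, which is why the correlation measure is informative here.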

References

Azmuddin, R., Ali, Z., Ngah, E., Mohd Tamili, L., & Mohd Ruslim, N. (2014). Extensive Reading Using Graded Readers. International Journal of Research in Social Sciences, 3(8), 109-113.

Backes, B. & Cowan, J. (2018, April). Is the Pen Mightier Than the Keyboard? The Effect of Online Testing on Measured Student Achievement. CALDER Working Paper No. 190. National Center for Analysis of Longitudinal Data in Educational Research (American Institutes for Research). Retrieved from https://caldercenter.org/sites/default/files/WP%20190.pdf?platform=hootsuite

Boevé, A. J., Meijer, R. R., Albers, C. J., Beetsma, Y. & Bosker, R. J. (2015). Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment. PloS one, 10(12), e0143616. https://doi.org/10.1371/journal.pone.0143616

Candrlic, S., Asenbrener Katic, M. & Holenko Dlab, M. (2014). Online vs. Paper-Based Testing: A Comparison of Test Results. Proceedings of the 37th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), 775-780. https://doi.org/10.1109/MIPRO.2014.6859649

Edinburgh Project on Extensive Reading: Placement Tests A & B. (1994). Institute for Applied Language Studies, University of Edinburgh.

Extensive Reading Foundation Placement Test. (n.d.). Retrieved from https://erfpt.ealps.shinshu-u.ac.jp/

Graham, S. (2016). Here’s How the Method of Testing Can Change Student Scores. The Conversation. Retrieved from https://theconversation.com/heres-how-the-method-of-testing-can-change-student-scores-54992

Herbert, H. (2016, March 16). Extensive Reading Level Placement: Determining Japanese College Students’ Appropriate Starting Levels. Language and Culture: The Journal of the Institute for Language and Culture of Konan University, 20, 143-156. https://doi.org/10.14990/00001722

Herold, B. (2016, February 3). PARCC Scores Lower for Students Who Took Exams on Computers. Education Week, 35(20), 1, 11. (Print version published February 10, 2016). Retrieved from https://www.edweek.org/ew/articles/2016/02/03/parcc-scores-lower-on-computer.html?cmp=SOC-SHR-TW

Jüngling, S., Telesko, R., & Reber, A. (2018). Checking the Student’s Aptitude for a Bachelor Program: Experiences with a Web-based Tool. PUPIL: International Journal of Teaching, Education and Learning, 2(2). Retrieved from https://grdspublishing.org/index.php/PUPIL/article/view/1505 https://doi.org/10.20319/pijtel.2018.22.149169

Lam, J. (2018). The Pedagogy-driven, Learner-centred, Objective-oriented and Technology-enabled (PLOT) Online Learning Model. PUPIL: International Journal of Teaching, Education and Learning, 2(2). Retrieved from https://grdspublishing.org/index.php/PUPIL/article/view/1432 https://doi.org/10.20319/pijtel.2018.22.6680

SPSS Version 23 [Software]. (1989, 2015). IBM Corp.

The Sustainable Development Agenda. (n.d.). Retrieved from https://www.un.org/sustainabledevelopment/development-agenda/

Walker, C. (1997). A Self Access Extensive Reading Project Using Graded Readers (with particular reference to students of English for academic purposes). Reading in a Foreign Language, 11(1), 121-149. Retrieved from https://nflrc.hawaii.edu/rfl/PastIssues/rfl111walker.pdf

White, S., Kim, Y. Y., Chen, J., & Liu, F. (2015). Performance of Fourth-grade Students in the 2012 NAEP Computer-based Writing Pilot Assessment: Scores, Text Length, and Use of Editing Tools (No. NCES 2015-119). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Retrieved from https://nces.ed.gov/nationsreportcard/subject/writing/pdf/2015119.pdf

Yoshizawa, K. (2014, March). Edinburgh Project on Extensive Reading (EPER) Reading Comprehension Tests: Scoring and Setting Cutoff Scores. Kansai Daigaku Gaikokugo Gakubu Kiyo, 10, 33-43. Retrieved from https://www.kansai-u.ac.jp/fl/publication/pdf_department/10/02yoshizawa.pdf

Zwier, L. J. (2012). Inside Reading 2 (2nd Ed.). Oxford University Press.

Published

2019-12-06

How to Cite

Ihata, A. C. (2019). COMPARATIVE STUDY ON RELIABILITY BETWEEN ONLINE AND PAPER-BASED VERSIONS OF A TEST FOR READING IN ENGLISH AS A FOREIGN LANGUAGE. PEOPLE: International Journal of Social Sciences, 5(3), 215–223. https://doi.org/10.20319/pijss.2019.53.215223