Optimizing Educational Assessment: The Practicality of Computer Adaptive Testing (CAT) with an Item Response Theory (IRT) Approach

Asrul Huda - Universitas Negeri Padang, 25131, Indonesia
Firdaus Firdaus - Universitas Negeri Padang, 25131, Indonesia
Dedy Irfan - Universitas Negeri Padang, 25131, Indonesia
Yeka Hendriyani - Universitas Negeri Padang, 25131, Indonesia
Almasri Almasri - Universitas Negeri Padang, 25131, Indonesia
Murni Sukmawati - Universitas Negeri Padang, 25131, Indonesia


DOI: http://dx.doi.org/10.62527/joiv.8.1.2217

Abstract


This research aims to develop a Computer Adaptive Test (CAT) system using the Item Response Theory (IRT) approach. The study is part of developing a web-based system using the Research and Development (R&D) method with the Four-D (4-D) model. At its core, the system resembles a Computer-Based Test (CBT); the critical difference lies in its ability to randomize and deliver questions matched to the test-taker's ability level using an IRT algorithm. The system employs the 3-PL IRT model, which accounts for each question's difficulty level, discriminating power, and probability of guessing. The examination system assigns questions to students based on their responses to previous questions, so each test-taker receives a unique question sequence. The exam concludes when the test-taker's ability is estimated with sufficient precision, i.e., standard error SE <= 0.01, or when all questions have been answered. The outcome of this research is an Item Response Theory (IRT)-based Computer Adaptive Test (CAT) system that can be used to assess students' learning outcomes. The research was implemented in the Multimedia Department of SMK Negeri 1 Gunung Talang, with 90 students as the research sample. The evaluation of the system's practicality received very high scores, indicating that the IRT-based CAT system is highly practical and effective in achieving the established measurement goals.
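The adaptive loop described above (3-PL response model, ability-based item selection, and the SE <= 0.01 stopping rule) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the ability estimator or the item-selection rule, so this sketch assumes two common choices, expected a posteriori (EAP) estimation on a grid and maximum-information item selection; all function and variable names are hypothetical.

```python
import math

def p_3pl(theta, a, b, c):
    """3-PL probability of a correct response: discrimination a,
    difficulty b, and guessing (lower-asymptote) parameter c."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

def item_information(theta, a, b, c):
    """Fisher information of one 3-PL item at ability theta."""
    p = p_3pl(theta, a, b, c)
    q = 1 - p
    return (a ** 2) * (q / p) * ((p - c) / (1 - c)) ** 2

def estimate_theta(responses, items):
    """EAP ability estimate and its standard error, using a
    standard-normal prior evaluated on a grid from -4 to 4."""
    grid = [i / 10 for i in range(-40, 41)]
    posterior = []
    for t in grid:
        weight = math.exp(-t * t / 2)  # unnormalized N(0, 1) prior
        for u, (a, b, c) in zip(responses, items):
            p = p_3pl(t, a, b, c)
            weight *= p if u == 1 else (1 - p)
        posterior.append(weight)
    z = sum(posterior)
    theta = sum(t * w for t, w in zip(grid, posterior)) / z
    var = sum((t - theta) ** 2 * w for t, w in zip(grid, posterior)) / z
    return theta, math.sqrt(var)

def next_item(theta, bank, used):
    """Pick the unused item with maximum information at the current theta."""
    best, best_info = None, -1.0
    for idx, (a, b, c) in enumerate(bank):
        if idx not in used and item_information(theta, a, b, c) > best_info:
            best, best_info = idx, item_information(theta, a, b, c)
    return best
```

An exam session would repeat `next_item` / `estimate_theta` after each response and stop once the returned standard error falls to 0.01 or the bank is exhausted, mirroring the termination condition stated above.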


Keywords


Computer Adaptive Test, Item Response Theory, Research and Development, Four-D Model, Educational Assessment

