COMPARISON OF ITEM RESPONSE BASED PARAMETER LOGISTIC MODELS OF RAJU DIFFERENTIAL ITEM FUNCTIONING


Daniel Oyeniran

Abstract

In educational and psychological assessment, item bias can give some groups an unfair advantage or disadvantage. Consequently, statistical strategies for detecting differential item functioning (DIF), such as Raju's area method, have been proposed. Most of these techniques are designed to compare pre-defined focal and reference groups, such as males and females.


A total of 480 students attempted 30 mathematics items drawn from five content areas that chief examiners of the West African Examinations Council (WAEC) have consistently identified over the past ten years as areas in which students struggle to obtain appreciable results. The Raju area method of DIF detection was used, and because it is based on item response theory (IRT), the 1PL, 2PL, and 3PL models were compared to determine which items are biased against the male or female samples. The results show that under the 1PL model two items exhibit DIF, whereas under the 2PL and 3PL models 26 items are flagged as biased.
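Raju's method flags an item as exhibiting DIF when the area between the item characteristic curves (ICCs) estimated for the reference and focal groups is significantly different from zero. As a minimal illustrative sketch (not the authors' analysis code; the function names, the θ range, and the scaling constant D = 1.702 are assumptions), the unsigned area between two 3PL ICCs can be approximated numerically:

```python
import numpy as np

def icc_3pl(theta, a, b, c=0.0):
    # 3PL item characteristic curve; c = 0 gives the 2PL,
    # and fixing a while c = 0 gives the 1PL.
    return c + (1.0 - c) / (1.0 + np.exp(-1.702 * a * (theta - b)))

def raju_unsigned_area(ref, foc, lo=-4.0, hi=4.0, n=2001):
    # Approximate Raju's (1988) unsigned area between the
    # reference- and focal-group ICCs by the trapezoid rule
    # on an ability (theta) grid.
    theta = np.linspace(lo, hi, n)
    diff = np.abs(icc_3pl(theta, *ref) - icc_3pl(theta, *foc))
    dx = theta[1] - theta[0]
    return float(np.sum((diff[:-1] + diff[1:]) * 0.5 * dx))

# Identical parameters in both groups -> area 0 (no DIF).
print(raju_unsigned_area((1.2, 0.0, 0.0), (1.2, 0.0, 0.0)))  # 0.0
# Equal discrimination, focal difficulty shifted by 0.5
# -> area close to |b_F - b_R| = 0.5.
print(round(raju_unsigned_area((1.0, 0.0, 0.0), (1.0, 0.5, 0.0)), 2))  # 0.5
```

In practice the group-specific item parameters are first estimated and placed on a common metric, and the observed area is tested against its standard error; hypothesis-testing machinery is omitted here for brevity.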


These results suggest that the topics are not as difficult as they appear; students score poorly because of bias in the items. Without accounting for DIF, one would conclude that only about half of the students performed well on these items. It is not that the students do not know the answers, but that the biased items prevent them from responding effectively.



Article Details

How to Cite
Oyeniran, D. (2024). COMPARISON OF PARAMETER LOGISTIC MODELS OF RAJU DIFFERENTIAL ITEM FUNCTIONING ITEM RESPONSE BASED. The African Journal of Behavioural and Scale Development Research, 5(1). https://doi.org/10.58579/AJB-SDR/5.1.2023.70
Section
Articles

References

Ayanwale, M. A. (2017). Efficacy of Item Response Theory in score ranking and concurrent validity of dichotomous and polytomous response mathematics achievement test in Osun state, Nigeria. Unpublished Ph.D. thesis, Institute of Education, University of Ibadan.

Ayva-Yörü, F. G., and Atar, H. Y. (2019). Determination of differential item functioning (DIF) according to SIBTEST, Lord's χ², Raju's area measurement, and Breslow-Day methods. Journal of Pedagogical Research, 3(3), 139-150.

Camilli, G., and Shepard, L. A. 1994. Methods for identifying biased test items. Newbury Park, CA: Sage.

Chen, W. H., and Thissen, D. (1997). Local dependence indexes for item pairs using item response theory. Journal of Educational and Behavioral Statistics, 22(3), 265-289.

Choi, S. W. (2010). Differential item functioning analysis. In S. E. Embretson & S. L. Hershberger (Eds.), The Wiley handbook of psychometric testing: A multidisciplinary reference on survey, scale and test development (pp. 709-735). Wiley.

Crocker, L., and Algina, J. (1986). Introduction to classical and modern test theory. Orlando: Harcourt Brace Jovanovich Inc.

de Ayala, R. J. (2009). The theory and practice of item response theory. Guilford Press.

Embretson, S. E., and Reise, S. P. (2000). Item response theory for psychologists. Lawrence Erlbaum Associates.

Embretson, S. E., and Reise, S. P. (2013). Item response theory for psychologists. Psychology Press.

Hambleton, R. K., Swaminathan, H., and Rogers, H. J. (1991). Fundamentals of item response theory. Newbury Park, CA: Sage

Jackman, W. M., and Morrain-Webb, J. (2019). Exploring gender differences in achievement through student voice: Critical insights and analyses. Cogent Education, 6(1). https://doi.org/10.1080/2331186X.2019.1567895

Karami, H. (2012). An introduction to differential item functioning. The International Journal of Educational and Psychological Assessment, 11(2), 59-76.

Magis, D., Béland, S., Tuerlinckx, F., and De Boeck, P. (2010). A general framework and an R package for the detection of dichotomous differential item functioning. Behavior Research Methods, 42(3), 847-862.

Okoro, O. (2006). Principles and Methods in Vocational and Technical Education. Nsukka: University Trust Publishers.

Osterlind, S. J., and Everson, H. T. (2009). Differential item functioning analysis with 2- and 3-parameter logistic models: DIFdetect and difwithpar. Applied Measurement in Education, 22(4), 331-347. https://doi.org/10.1080/08957340903245435

Raju, N. S. (1988). The area between two item characteristic curves. Psychometrika, 53(4), 495-502.

Raju, N. S., and Arenson, E. (2002). Developing a common metric in item response theory: An area-minimization approach. Paper presented at the annual meeting of the National Council on Measurement in Education, New Orleans, LA.

Raju, N. S., van der Linden, W. J., and Fleer, P. F. (1995). IRT-based internal measures of differential functioning of items and tests. In K. F. Geisinger (Ed.), Psychological testing of Hispanics (pp. 271-297). American Psychological Association.

Revelle, W. (2021). psych: Procedures for Psychological, Psychometric, and Personality Research. Northwestern University, Evanston, Illinois. R package version 2.1.6. https://CRAN.R-project.org/package=psych

RStudio Team (2020). RStudio: Integrated Development Environment for R. RStudio, PBC, Boston, MA. http://www.rstudio.com/

Schumacker, R. (2005). Test bias and differential item functioning. Retrieved September 2021 from http://www.appliedmeasurementassociates.com/WhitePapers/TEST-Bias-and-Differential-Item-Functioning.pdf

Wang, H., Cao, Y., Huang, H., and Chen, J. (2018). Investigating differential item functioning on a Chinese mathematics test: Using the Raju method to estimate DIF impact. International Journal of Testing, 18(3), 262-277. https://doi.org/10.1080/15305058.2018.1448134

Workman, J., and Heyder, A. (2020). Gender achievement gaps: The role of social costs to trying hard in high school. Social Psychology of Education, 23, 1407-1427. https://doi.org/10.1007/s11218-020-09588-6