Rethinking WIDA's ACCESS Test Validity and Reliability: Bibliometric and Content Analysis (2005–2025)

Authors

  • Dr. Housseine Bachiri, Bilingual & Multicultural Department, Lincoln Center-Illinois, Waukegan, Illinois, United States.

DOI:

https://doi.org/10.55737/rl.2025.44132

Keywords:

ACCESS Test, Validity, Reliability, Psychometrics, Rasch Model, Bibliometric Analysis, Content Analysis, English Language Proficiency

Abstract

This study investigates the validity and reliability of WIDA’s ACCESS test from 2005 to the present, with particular emphasis on its capacity to accurately measure the English language proficiency of English learners (ELs). Using a combination of bibliometric and content analysis methods, the research synthesizes insights from more than 30 scholarly and empirical studies, highlighting trends, methodological approaches, and key findings pertinent to ACCESS. In addition, the study analyzes ACCESS test data gathered across 14 Illinois school districts during the 2022–2024 academic years to provide a current, data-driven perspective on student performance patterns. The analysis centers on the Rasch Model, which underpins the ACCESS scoring system and provides a framework for measuring proficiency, while also considering its inherent limitations. The findings disclose a complex interplay between systemic factors, such as bilingual program structure (90:10 versus 50:50), and the instructional core, including teaching strategies and classroom supports, all of which collectively shape ACCESS test outcomes and their interpretation. The findings further indicate that more longitudinal empirical research is needed to examine and evaluate the validity and reliability of the ACCESS test, and they propose a new practical testing philosophy that would draw theoretically and empirically on existing English language proficiency tests, leading to more valid and reliable testing methods.
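For context, the dichotomous Rasch Model referenced above expresses the probability that a test taker of a given ability answers an item of a given difficulty correctly, placing persons and items on a common logit scale. A standard statement of the model (provided here for orientation; the notation is not drawn from the article itself) is:

$$P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}$$

where $X_{ni}$ is person $n$'s scored response to item $i$, $\theta_n$ is the person's ability, and $b_i$ is the item's difficulty (Rasch, 1960). When $\theta_n = b_i$, the probability of a correct response is exactly 0.5.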

Author Biography

  • Dr. Housseine Bachiri, Bilingual & Multicultural Department, Lincoln Center-Illinois, Waukegan, Illinois, United States.

    Corresponding Author: [email protected]

References

Abedi, J., & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219–234. https://doi.org/10.1207/s15324818ame1403_2

Al Fraidan, A. (2025). Test anxiety across writing formats: The mediating role of perceived teacher strictness. Acta Psychologica, 256, 104942. https://doi.org/10.1016/j.actpsy.2025.104942

Avdiu, V., & Ahmedi, V. (2024). Alternative Assessment Strategies to Enhance Learning for Students with Special Needs. Journal of Social Studies Education Research, 15(5), 1–25. https://jsser.org/index.php/jsser/article/view/6024

Aydin, S., Denkci Akkaş, F., Türnük, T., Baştürk Beydilli, A., & Saydam, İ. (2020). Test anxiety among foreign language learners: A qualitative study. The Qualitative Report. https://doi.org/10.46743/2160-3715/2020.4686

Aydınlar, A., Mavi, A., Kütükçü, E., Kırımlı, E. E., Alış, D., Akın, A., & Altıntaş, L. (2024). Awareness and level of digital literacy among students receiving health-based education. BMC Medical Education, 24(1). https://doi.org/10.1186/s12909-024-05025-w

Bachiri, H. (2025). Teaching for Biliteracy in the United States: Pitfalls and Recommendations. ProScholar Insights, 4(3), 78-95. https://doi.org/10.55737/psi.2025c-43106

Beeman, K., & Urow, C. (2013). Teaching for Biliteracy: Strengthening Bridges Between Languages. Caslon Publishing.

Bennett, R. E. (2023). Toward a theory of socio-culturally responsive assessment. Language Testing, 40(3), 389–411. https://doi.org/10.1080/10627197.2023.2202312

Berliner, D. C., & Nichols, S. L. (2007). High-stakes testing is putting the nation at risk. Education Week, 26(27), 36–48.

Berman, A. I., Haertel, E. H., & Pellegrino, J. W. (2020). Comparability of Large-Scale Educational Assessments: Issues and Recommendations. National Academy of Education.

Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences (2nd ed.). Routledge. https://doi.org/10.4324/9781315814698

Boone, W. J., & Scantlebury, K. (2006). The role of Rasch analysis when conducting science education research utilizing multiple-choice tests. Science Education, 90(2), 253–269. https://doi.org/10.1002/sce.20106

Brookhart, S. M. (2012). SAGE handbook of research on classroom assessment. SAGE Publications.

Celestino, M. S., Belluzzo, R. C. B., Albino, J. P., & Valente, V. C. P. N. (2024). Bibliometric analysis: Literature review and proposal of a methodological framework in 12 steps. Revista Aracê, 6(4), 13421–13446. https://doi.org/10.56238/arev6n4-146

Cohen, M. (2022). 8 Reasons WIDA ACCESS Student Scores May Be Invalid. Owlcation.

Cross, R. (2025). The impact of assessment on teaching and learning: Positive washback in classroom contexts. Oxford University Press.

Deng, Y., & Liu, H. (2025). To overcome test anxiety in on-line assessment: unpacking the mediator roles of techno competencies, teacher support, self-efficacy, and autonomy. BMC Psychology, 13(1), 192. https://doi.org/10.1186/s40359-025-02545-y

Downing, V. R. (2020). Fear of negative evaluation and student anxiety in community college students. CBE-Life Sciences Education, 19(4).

Echevarria, J., Vogt, M., & Short, D. (2017). Making content comprehensible for English learners: The SIOP model (5th ed.). Pearson.

Espinosa, L. (2010, March 2). Assessment for Young ELLs: Strengths and Limitations in Current Practices. Colorín Colorado. https://www.colorincolorado.org/article/assessment-young-ells-strengths-and-limitations-current-practices

Foster, S. M. (2024). High-stakes Standardized Testing: Its Disproportionate Impact on Marginalized Communities.

Fox, J., & Fairbairn, S. (2011). Test review: ACCESS for ELLs®. Language Testing, 28(3), 425–431. https://doi.org/10.1177/0265532211404195

Frederiksen, N. (1984). The real test bias: Influences of testing on teaching and learning. American Psychologist, 39(3), 193–202. https://doi.org/10.1037/0003-066X.39.3.193

Gao, L., Zhang, Y., & Chen, J. (2025). Automatic assessment of text-based responses in post-secondary education: A systematic review. Computers & Education, 168, 104211. https://doi.org/10.1016/j.caeai.2024.100206

Goldstein, H. (1979). Consequences of using the Rasch model for educational assessment. British Educational Research Journal, 5(2), 211–220. https://doi.org/10.1080/0141192790050207

Guo, S., Wang, Y., Yu, J., Wu, X., Ayik, B., Watts, F. M., Latif, E., Liu, N., & Liu, L. (2025). Artificial intelligence bias on English language learners in automatic scoring. arXiv. https://doi.org/10.48550/arXiv.2505.10643

Hope, D., Kluth, D., Homer, M., Dewar, A., Goddard-Fuller, R., Jaap, A., & Cameron, H. (2024). Exploring the use of Rasch modelling in “common content” items for multi-site and multi-year assessment. Advances in Health Sciences Education, 30(2), 427–438. https://doi.org/10.1007/s10459-024-10354-y

Jalilzadeh, K., & Coombe, C. (2023). Constraints in employing learning-oriented assessment in EFL classrooms: Teachers’ perceptions. Language Testing in Asia, 13(1). https://doi.org/10.1186/s40468-023-00222-8

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity, and educational consequences. Educational Research Review, 2(2), 130–144.

Liu, H. (2025). The mediating role of foreign language anxiety. PMC. https://pmc.ncbi.nlm.nih.gov/articles/PMC12383800/

Liu, Y., & Boone, W. J. (2006). Applying Rasch measurement in classroom assessment. Journal of Science Education and Technology, 15(3), 257–274.

Ly, H. H. (2024). A Review of Alternative Assessment Methods and How to Apply Them in EFL Classrooms. Journal of Research in Humanities and Social Science, 12(5), 176–181.

McNeal, N. L. (2016). Correlating English language learner CRCT scores on the basis of English language learner access scores. Liberty University.

Panayides, P. (2010). The assessment revolution that has passed England by: Rasch measurement. British Educational Research Journal, 36(4), 611–626. https://doi.org/10.1080/01411920903018182

Patterson, E., & Schneider, E. (2024). Are WIDA test results appropriately reflecting multilingual learners’ language skills according to ESOL teachers’ experiences?: Results of a pilot study. GATESOL Journal, 33(1), 3–14. https://doi.org/10.52242/gatesol.184

Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen: Danish Institute for Educational Research.

Salazar, M. C. (2022). An Analysis of Variables Impacting English Learner Achievement on Science Assessments. Sustainability, 14(13), 7814. https://www.mdpi.com/2071-1050/14/13/7814

Shohamy, E. (2001). The power of tests: A critical perspective on the uses of language tests. Longman.

Smith, T. E. (2022). Self-management interventions for reducing challenging classroom behaviors. Campbell Systematic Reviews, 18(1), e1223. https://doi.org/10.1002/cl2.1223

Solano-Flores, G., & Trumbull, E. (2003). Examining language in context: The need for new research and practice paradigms in the testing of English-language learners. Educational Researcher, 32(2), 3–13. https://doi.org/10.3102/0013189x032002003

State Superintendent of Education. (2020). 2019-2020 Accessibility and Accommodations Supplement.

Tan, L. Y., McLean, S., Kim, Y. A., & Vitta, J. P. (2024). Rasch modelling vs. item facility: implications on the validity of assessments of Asian EFL/ESL vocabulary knowledge and lexical sophistication modelling. Language Testing in Asia, 14(1). https://doi.org/10.1186/s40468-024-00327-8

Villegas, L. (2022, October 19). Long-standing Limitations of English Learner Academic Assessment Data Persist. New America. https://www.newamerica.org/education-policy/edcentral/long-standing-limitations-of-english-learner-academic-assessment-data-persist/

Waters, C. N. (2020). Teachers' perceptions of the broad validity of a high stakes English language proficiency test [Dissertation, Virginia Commonwealth University]. Scholars Compass.

Waters, C. N. (2021). Considerations for Using ACCESS Test Scores in Decision Making. Virginia Commonwealth University Scholars Compass.

WIDA Consortium. (2011). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 100–2004–2005 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2013). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 400–2012–2013 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2014). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 500–2014–2015 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2015). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 600–2015–2016 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2016). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 700–2016–2017 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2017). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 800–2017–2018 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2018). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 900–2018–2019 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2019). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1000–2019–2020 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2020). ACCESS for ELLs 2.0: Test administration manual. WIDA.

WIDA Consortium. (2020). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1100–2020–2021 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2021). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1200–2021–2022 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2022). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1300–2022–2023 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2023). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1400–2023–2024 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2024). Annual Technical Report for ACCESS for ELLs® English Language Proficiency Test Series 1500–2024–2025 Administration. Madison, WI: WIDA Consortium.

WIDA Consortium. (2025). Annual Technical Report for WIDA Alternate ACCESS for ELLs. Madison, WI: WIDA Consortium.

WIDA. (2019). Investigating K–12 English learners’ use of universal tools embedded in online language assessments.

WIDA. (2020). WIDA Accessibility and Accommodations Framework. https://wida.wisc.edu/assess/accessibility-accommodations

WIDA. (2025). ACCESS for ELLs Interpretive Guide for Score Reports Grades K-12. Board of Regents of the University of Wisconsin System.

WIDA. (n.d.). 2019–2020 Accessibility and Accommodations Supplement.

Wolf, M. K. (2008). Validity Issues in Assessing English Language Learners' Language Proficiency. Language Testing, 25(3), 355–377. https://doi.org/10.1080/10627190802394222

Wolf, M. K. (2010). Improving the Validity of English Language Learner Assessments. Educational Policy, 24(4), 523–545.

Zheng, Y., & Cheng, L. (2018). How does anxiety influence language performance? From the perspectives of foreign language classroom anxiety and cognitive test anxiety. Language Testing in Asia, 8(1). https://doi.org/10.1186/s40468-018-0065-4

Published

2025-12-13

Section

Articles

How to Cite

Bachiri, H. (2025). Rethinking WIDA’s ACCESS Test Validity and Reliability: Bibliometric and Content Analysis (2005–2025). Regional Lens, 4(4), 91–113. https://doi.org/10.55737/rl.2025.44132