Bridging assessment and learning: A cognitive diagnostic analysis of a large-scale Spanish proficiency test

Authors

Mengmeng Wang, Beijing Foreign Studies University

DOI:

https://doi.org/10.30827/portalin.vi40.15930

Keywords:

Cognitive Diagnostic Assessment, Large-scale Test, Spanish, Individualized Learning

Abstract

There is still relatively little research on the cognitive diagnostic analysis of large-scale, high-stakes language tests to assess the individualized strengths and weaknesses of each learner, and even less on the accuracy and applicability of the resulting feedback in subsequent learning. Drawing on both the cognitive diagnostic results of 1,933 test takers of a national Spanish test and qualitative data from their bachelor's thesis drafts, the precision of the diagnostic feedback was examined to verify its usefulness for improving the literature review and discussion sections. The results showed that the method fits the test performance adequately and can determine the specific skill profile of each test taker, which is not always consistent with the scores provided by classical test analysis. Triangulating the diagnostic reports with the literature review and discussion sections of the learners' thesis drafts shows that the cognitive diagnostic approach applied to a large-scale test can assess reading skills more accurately and that the feedback is valuable for remedying future academic reading and guiding thesis revision.
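
To make the underlying mechanics concrete, the sketch below illustrates how a cognitive diagnostic model of the kind discussed here links a Q-matrix and an attribute mastery profile to expected item performance. It uses the DINA model as a representative member of this model family, not as the article's own analysis code; the attributes, Q-matrix entries, and slip/guessing values are entirely hypothetical.

```python
import numpy as np

# Minimal sketch of the DINA ("deterministic input, noisy AND") model,
# a common cognitive diagnostic model. All values are hypothetical and
# only illustrate how skill profiles are linked to item responses.

# Q-matrix: rows = items, columns = reading attributes required by each item.
Q = np.array([
    [1, 0, 0],   # item 1 requires attribute A1 only
    [1, 1, 0],   # item 2 requires A1 and A2
    [0, 1, 1],   # item 3 requires A2 and A3
])

# One test taker's (hypothetical) attribute mastery profile: 1 = mastered.
alpha = np.array([1, 1, 0])

# Ideal response: 1 iff the learner masters every attribute the item requires.
eta = np.all(alpha >= Q, axis=1).astype(int)

# Noisy layer: slip (fail despite mastery) and guess (succeed without mastery).
slip = np.array([0.10, 0.15, 0.20])   # hypothetical slip parameters
guess = np.array([0.20, 0.10, 0.25])  # hypothetical guessing parameters

# Probability of a correct response under the DINA model.
p_correct = np.where(eta == 1, 1 - slip, guess)

print("Ideal responses:", eta)    # [1 1 0]
print("P(correct):", p_correct)   # [0.9 0.85 0.25]
```

In practice, estimation runs in the opposite direction: the model infers each test taker's attribute profile from observed responses, which is what yields the individualized skill reports described in the abstract.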

Author Biography

Mengmeng Wang, Beijing Foreign Studies University

Facultad de Estudios Hispánicos y Portugueses

References

Abu-Alhija, F.N. (2007). Large-scale Testing: Benefits and pitfalls. Studies in Educational Evaluation, 33, 50-68. https://doi.org/10.1016/j.stueduc.2007.01.005.

Brown, D.H. (2000). Principles of Language Learning and Teaching. New York: Longman.

Buck, G. & Tatsuoka, K.K. (1995). Investigation of the linguistic, cognitive and method attributes underlying test task preference: a pilot analysis using rule space methodology. Paper presented at the Language Testing Research Colloquium, Long Beach, CA.

Buck, G., Tatsuoka, K.K. & Kostin, I. (1997). The Subskills of Reading: Rule-space Analysis of a Multiple-choice Test of Second Language Reading Comprehension. Language Learning, 47, 423–466. https://doi.org/10.1111/0023-8333.00016.

Buck, G. & Tatsuoka, K.K. (1998). Application of the rule-space procedure to language testing: examining attributes of a free response listening test. Language Testing, 15(2), 119-157. https://doi.org/10.1177/026553229801500201.

Chen, H. & Chen, J. (2016). Retrofitting Non-cognitive-diagnostic Reading Assessment Under the Generalized DINA Model Framework. Language Assessment Quarterly, 13(3), 218-230. https://doi.org/10.1080/15434303.2016.1210610.

Chin, H., Chew, C., Lim, H.L. & Thien, L.M. (2021). Development and Validation of a Cognitive Diagnostic Assessment with Ordered Multiple-Choice Items for Addition of Time. International Journal of Science and Mathematics Education, (1), 137-157. https://doi.org/10.1007/s10763-021-10170-5.

De la Torre, J. & Douglas, J.A. (2004). Higher-order latent trait models for cognitive diagnosis. Psychometrika, 69, 333-353. https://doi.org/10.1007/BF02295640.

De la Torre, J. (2008). An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications. Journal of Educational Measurement, 45(4), 343-362. https://doi.org/10.1111/j.1745-3984.2008.00069.x.

De la Torre, J., Hong, Y. & Deng, W. (2010). Factors Affecting the Item Parameter Estimation and Classification Accuracy of the DINA Model. Journal of Educational Measurement, 47(2), 227-249. https://doi.org/10.1111/j.1745-3984.2010.00110.x.

De la Torre, J. (2011). The Generalized DINA Model Framework. Psychometrika, 76(2), 179-199. https://doi.org/10.1007/s11336-011-9207-7.

De la Torre, J. & Minchen, N. (2014). Cognitively Diagnostic Assessments and the Cognitive Diagnosis Model Framework. Psicología Educativa, 20, 89-97. https://doi.org/10.1016/j.pse.2014.11.001.

Haertel, E.H. (1989). Using restricted latent class models to map the skill structure of achievement items. Journal of Educational Measurement, 26(4), 301-323.

Jang, E.E. (2009). Cognitive Diagnostic Assessment of L2 Reading Comprehension Ability: Validity Arguments for Fusion Model Application to LanguEdge Assessment. Language Testing, 26(1), 31-73. https://doi.org/10.1177/0265532208097336.

Kim, A. (2015). Exploring ways to provide diagnostic feedback with an ESL placement test: Cognitive diagnostic assessment of L2 reading ability. Language Testing, 32(2), 227-258. https://doi.org/10.1177/0265532214558457.

Lee, Y. & Sawaki, Y. (2009). Cognitive Diagnosis Approaches to Language Assessment: An Overview. Language Assessment Quarterly, 6, 172-189.

Li, H. & Suen, H. K. (2013). Constructing and Validating a Q-matrix for Cognitive Diagnostic Analyses of a Reading Test. Educational Assessment, 18(1), 1-25. https://doi.org/10.1080/10627197.2013.761522.

Li, H., Hunter, V.C. & Lei, P. (2016). The Selection of Cognitive Diagnostic Models for a Reading Comprehension Test. Language Testing, 33(3), 391-409. https://doi.org/10.1177/0265532215590848.

Min, S. & He, L. (2021). Developing individualized feedback for listening assessment: Combining standard setting and cognitive diagnostic assessment approaches. Language Testing, 38(1), 1-27. https://doi.org/10.1177/0265532221995475.

Mislevy, R.J. (1989). Foundations of a new test theory. Educational Testing Service.

National Advisory Committee for Foreign Language Teaching. (1998). National College Spanish Teaching Syllabus for Spanish Majors. Shanghai: Shanghai Foreign Language Education Press.

Ranjbaran, F. & Alavi, S.M. (2017). Developing a reading comprehension test for cognitive diagnostic assessment: A RUM analysis. Studies in Educational Evaluation, 55, 167-179. https://doi.org/10.1016/j.stueduc.2017.10.007.

Sawaki, Y., Kim, H. & Gentile, C. (2009). Q-matrix Construction: Defining the Link between Constructs and Test Items in Large-scale Reading and Listening Comprehension Assessments. Language Assessment Quarterly, 6, 190–209. https://doi.org/10.1080/15434300902801917.

Tatsuoka, K.K. (1983). Rule-space: An approach for Dealing with Misconceptions Based on Item Response Theory. Journal of Educational Measurement, 20(4), 345-354. https://doi.org/10.1111/j.1745-3984.1983.tb00212.x.

Toprak, T. E. & Cakir, A. (2021). Examining the L2 Reading Comprehension Ability of Adult ELLs: Developing a Diagnostic Test within the Cognitive Diagnostic Assessment Framework. Language Testing, 38(1), 106-131. https://doi.org/10.1177/0265532220941470.

Von Davier, M. (2005). A General Diagnostic Model Applied to Language Testing Data (ETS Research Rep. No. RR-05-16). Princeton, NJ: Educational Testing Service.

Wang, W. & Qiu, X. (2019). Multilevel Modeling of Cognitive Diagnostic Assessment: The Multilevel DINA Example. Applied Psychological Measurement, 43(1), 34-50. https://doi.org/10.1177/0146621618765713.

Wu, X., Wu, R., Chang, H., Kong, Q. & Zhang, Y. (2020). International Comparative Study on PISA Mathematics Achievement Test Based on Cognitive Diagnostic Models. Frontiers in Psychology, 11, 1-13. https://doi.org/10.3389/fpsyg.2020.02230.

Yi, Y. (2017). Probing the Relative Importance of Different Attributes in L2 Reading and Listening Comprehension Items: An Application of Cognitive Diagnostic Model. Language Testing, 34(3), 337-355. https://doi.org/10.1177/0265532216646141.

Zheng, S. & Liu, Y. (2015). A study of Spanish Education in Colleges and Universities in China. Beijing: Foreign Language Teaching and Research Press.

Published

2023-06-27

How to Cite

Wang, M. (2023). Bridging assessment and learning: A cognitive diagnostic analysis of a large-scale Spanish proficiency test. Porta Linguarum: An International Journal of Foreign Language Teaching and Learning, (40), 9–24. https://doi.org/10.30827/portalin.vi40.15930

Issue

No. 40 (2023)

Section

Articles