LEA@NERA: Digitalization and Technologies in Education - Opportunities and Challenges

Several LEA members associated with the national project “Kartleggingsprøver i regning” are presenting at the Nordic Educational Research Association (NERA) 2023 conference in Oslo. Our members contributed to the conference with one article presentation and one poster presentation (see below for more details).

Article presentation

Abstract

Teachers’ assessment responsibilities and tasks change when they move from paper-based to digital assessment formats. For instance, tasks that were formerly the teacher’s responsibility, such as providing instructions to students and scoring and grading the assessments, are often left to the test delivery platform in digital assessments. The purpose of this paper presentation is to discuss the challenges and opportunities the transition from paper-based to digital assessment poses for teachers’ assessment literacy (TAL).

TAL can be defined as teachers’ understanding of the principles of sound assessment (Popham, 2009; Stiggins, 2005) and is a critical component of assessment for learning from an equity perspective (Heritage et al., 2009). TAL includes knowledge about tests, how to interpret test results, and how to use these results to improve student learning. Moreover, TAL includes being able to adjust instruction and knowing what to teach next based on assessment data. However, in a digital format, teachers may have more difficulty interpreting assessment data and may therefore find it more challenging to use these data formatively to provide student feedback and plan teaching interventions.

This paper presents data from an in-depth interview study involving 14 grade 1 and grade 3 teachers, focusing on the teachers’ experiences with a national-level formative numeracy assessment. Interviews were conducted on Zoom or at the teachers’ schools, audiotaped, and later transcribed. Interviews typically lasted 40 to 65 minutes. Thematic analyses were performed.

The analyses revealed that digital assessments place greater demands on TAL than paper-based assessments: Teachers likely feel out of control more often and struggle to differentiate between the digital and mathematical demands of the assessment. Moreover, interpreting data likely becomes more challenging when teachers have not scored the tests themselves, particularly if they do not have a test blueprint. Teachers often look for errors, misconceptions, and “holes” rather than signs of what students can do, which makes interpreting the data even more challenging. Potential consequences for professional development will be discussed.

References
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.


Popham, W. J. (2009). Assessment literacy for teachers: Faddish or fundamental? Theory Into Practice, 48(1), 4–11. https://doi.org/10.1080/00405840802577536

Stiggins, R. (2005). From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan, 87(4), 324–328.

Poster presentation

New digital mapping tests for young students – some challenges and solutions. (Oksana Kovpanets, Henrik H. Haram, Karianne Berg Bratting, Guri A. Nortvedt, Andreas Pettersen)

Abstract

In 2019, the Norwegian national authorities decided to develop a new generation of mapping tests for the 1st and 3rd grades. The new tests are digital and were put into use during the spring of 2022.

Our research group explored the opportunities afforded by the use of a digital platform, such as the possibility of including audio-visual support or letting students use manipulatives when they respond. We developed items that were easy to understand, that were adapted to digital ways of responding, and that used the advantages of a digital format to assess the knowledge and skills that students must learn and develop in their early years.

Two main methods were used to collect data at the item and student levels to investigate how well the items function for young learners: 1) Cognitive labs (Leighton, 2017) were used to study the mental processes a student uses when solving an item. This method gave us valuable insight into pupils’ understanding of the instructions and wording in the tasks, the strategies students use to solve the tasks, and the technical and motor challenges that arise when young pupils work with digital formats. 2) Large-scale pilot studies gave us information about whether our items measure the knowledge and skills they are intended to measure. Data at the item and student levels were analyzed using 2PL IRT (Item Response Theory) models and more qualitatively oriented classical analyses. These analyses gave us insights into item difficulty, discrimination, and test-taker behavior.
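As background for readers unfamiliar with the 2PL model mentioned above, the standard formulation (a general sketch; the abstract does not specify the project’s exact parameterization or estimation setup) models the probability that a student with latent ability θ answers item i correctly in terms of the item’s discrimination a_i and difficulty b_i:

\[ P(X_i = 1 \mid \theta) = \frac{1}{1 + \exp\bigl(-a_i(\theta - b_i)\bigr)} \]

Here b_i indicates the ability level at which a student has a 50% chance of answering the item correctly, and a_i indicates how sharply the item separates students of lower and higher ability, which is what the references to item difficulty and discrimination point to.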

The main challenges that our analysis of the cognitive labs and test data revealed are fine motor, visual, aesthetic, and technical challenges, as well as challenges caused by too much information on the screen or tests that are too long. For example, young students may not yet have developed fine motor skills and can struggle to move objects on the screen, click on objects, or change and revise their answers. Furthermore, the size, placement, and color of the buttons are important; buttons or answer boxes that are too small can be difficult to select. Many items were therefore developed using multiple formats, and the results for these items in the cognitive labs and pre-pilots were compared.

The poster will present some challenges in the development of digital tests for young students that are important to be aware of, and our solutions to those challenges.

References

Leighton, J. P. (2017). Using think-aloud interviews and cognitive labs in educational research. Oxford University Press.

