
Back to School 2020: An Unprecedented Time for Assessment

By Megan Welsh posted 11-30-2020 10:18 PM

  

Laurie Laughlin Davis & Stephanie Lawkins, Curriculum Associates

 

Curriculum Associates serves approximately 30% of US students in grades K-8 with reading and math curriculum and assessments designed to inform instruction. Data collected so far this school year offer insights about how students are doing in light of changes to schooling since the pandemic began, and they raise questions about at-home testing. The i-Ready Diagnostic is an online, criterion-referenced, computer-adaptive assessment administered three times a year, beginning in the fall, to inform instructional decisions and pathways.

 

Early this summer, we felt confident as a company that most students across the country would be back in school for at least part of the week, and we were advising our district partners to administer the Diagnostic assessment when students were on site. By the end of July, though, conditions shifted rapidly, with large districts such as Los Angeles, Atlanta, and Chicago going fully remote to start the school year. We knew then that we needed to quickly figure out what it would look like to assess well when students were at home.

 

This was a more daunting challenge than it might seem. Some of our districts had tested at home in the spring, and the results were not particularly promising. The biggest issue we saw in Spring 2020 data was inexplicable score inflation, particularly in grades K-2. There was also wide variability in data consistency at the district, school, and even class level. This wasn't an issue specific to the i-Ready Diagnostic: in CCSSO (Council of Chief State School Officers) working group sessions with other large national interim assessment organizations, all participants reported challenges with consistently capturing accurate student assessment data in remote Spring 2020 testing.

 

The Research and Psychometrics teams dug into online instructional data and the test results from the 874 schools that administered the Reading Diagnostic and the 800 schools that administered the Mathematics Diagnostic at home in Spring 2020 to identify schools, and even teachers, whose data were consistent with in-school administrations. Then, working with our Educator Success team, we interviewed principals and teachers who got good data to understand how they did so. (See the full research report. Information specific to the Diagnostic assessment begins on page 6.)

 

One of the most promising findings from these interviews was that educators can put systems in place to drive data consistency during remote testing, including in schools that serve high-poverty, high-minority communities. Our Educator Success team, in collaboration with teams that directly serve districts and schools, moved quickly to create and disseminate Assessing At Home kits that translated best practices around multi-tiered communications, remote proctoring, and family/caregiver engagement into actionable steps for educators.

 

Turning to this fall: as of October 1, slightly more than 65% of Diagnostic assessments administered in grades K-8 had been taken in a location other than school. Now that the data are starting to come back, we are seeing some very positive trends in administration fidelity.

  • Across all grades, 66% of students who took their Reading Diagnostic outside of school had test administration conditions highly consistent with those we would expect to see in school.

  • Across all grades, 73% of students who took their Math Diagnostic outside of school had test administration conditions highly consistent with those we would expect to see in school.

  • In a descriptive analysis, the median score of students who had high data consistency and tested at home was similar to that of their peers who tested in school, for grades 3-8 in math and grades 4-8 in reading.

 

We defined “highly consistent” through four screening measures: Did the student take too long on the test (more than 125% of the recommended test duration for the grade level)? Did the student switch devices during the test (e.g., from a laptop to an iPad)? Did the student take the test across more sittings than would seem reasonable (with “reasonable” defined as fewer than six sittings)? Did the student have an unexpected response pattern (as flagged by the outfit statistic)? Curiously enough, when we screen on these four variables, students who test out of school show strikingly similar results to students who test in school.
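To make the screen concrete, here is a minimal sketch in Python of how the four flags could be combined into a single consistency check. The record structure, field names, and the outfit cutoff are illustrative assumptions for this example only, not the production rules used for the Diagnostic; only the 125% duration threshold and the fewer-than-six-sittings rule come from the description above.

```python
# Illustrative sketch of the four consistency screens described above.
# Field names and the outfit cutoff are assumptions, not Curriculum
# Associates' actual implementation.
from dataclasses import dataclass


@dataclass
class DiagnosticRecord:
    total_minutes: float        # total time the student spent on the test
    recommended_minutes: float  # recommended duration for the grade level
    device_ids: set             # distinct devices used during the test
    num_sittings: int           # number of sessions used to finish the test
    outfit: float               # outfit statistic from the response pattern


def is_highly_consistent(rec: DiagnosticRecord,
                         outfit_cutoff: float = 2.0) -> bool:
    """Return True only when none of the four screens is flagged."""
    too_long = rec.total_minutes > 1.25 * rec.recommended_minutes
    switched_device = len(rec.device_ids) > 1
    too_many_sittings = rec.num_sittings >= 6   # "reasonable" = fewer than 6
    unexpected_pattern = rec.outfit > outfit_cutoff  # assumed cutoff value
    return not (too_long or switched_device
                or too_many_sittings or unexpected_pattern)


# Example: one device, two sittings, within the recommended time,
# unremarkable outfit statistic -> passes all four screens.
record = DiagnosticRecord(total_minutes=48, recommended_minutes=45,
                          device_ids={"chromebook-123"},
                          num_sittings=2, outfit=1.1)
print(is_highly_consistent(record))  # True
```

A screen like this simply partitions at-home administrations into "highly consistent" and "flagged" groups so that score comparisons with in-school testing can be restricted to the former.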

 

When we look at students who have tested in school (where testing conditions mirror pre-COVID circumstances), we do see evidence of a “COVID slide,” though not as dire as many predicted, with grades 2, 3, and 4 hardest hit and more regression in Math than in Reading. In contrast, when we look at data from students testing out of school, we see scores that are higher than historical averages, especially in the earlier grades. More research is needed on what is causing this trend. See a summary of our findings.

 

We continue to educate our district partners about why they are seeing these trends and are working with them to move past assessment into instruction, which was the point of testing at home in the first place: to inform instructional decisions. We will continue to monitor our data throughout the year, and we hope to use these data to help inform the broader field of assessment and measurement about COVID learning loss and at-home testing.

 
