Why do scores dip and not always go up across the year?

Although it would seem logical for expected percentile scores to increase steadily throughout the year, with students at all levels of performance showing smooth, linear growth from fall to spring, students sometimes defy logic. We also know that student growth is not always linear and that it varies by age and by construct. The percentile rank scores on easyCBM are based on empirical results of actual student performance on the fall, winter, and spring measures. Interestingly, it is common to see steeper growth from fall to winter and to see performance level off, or even decrease slightly, from winter to spring. Some of this pattern may be related to the ‘loss’ associated with summer break and to the jump in difficulty as students move from one grade to the next. Thus, students tend to start out scoring lower in the fall, with a fairly steep increase common from fall to winter.

In the spring, because these measures assess a full year’s worth of standards, it is common for student scores to level out, reflecting the extent to which students have mastered those standards by the time of testing. In some grade levels and measures, students may even show a decrease from winter to spring. What causes this spring decrease is uncertain, but the phenomenon is consistent with empirical results reported in the literature over the years. Some educators have theorized that it may be related to student motivation or to attention to non-academic pursuits in the springtime.

The percentile ranks/national norms are based on actual student performance on these actual tests, and we’ve replicated the results both with two different randomly drawn samples of 2,000 students per grade per measure for the National Norms and with numerous regional analyses. In short, we’ve found that ‘expected performance’ on standardized, low-stakes assessments really does decline as the year goes along for many students, particularly those in middle school.
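To give a sense of what that replication check looks like, here is a minimal sketch, not the easyCBM norming code: the score distribution and the norm-table construction below are simulated, hypothetical numbers. It shows how two independently drawn random samples of 2,000 produce nearly identical percentile cut points, which is what it means for the norms to replicate.

```python
# A sketch of replicating percentile norms with two random samples.
# The population of scores is simulated; none of this is easyCBM data.
import random

random.seed(1)

def percentile_cuts(scores, percentiles=(10, 25, 50, 75, 90)):
    """Score at each percentile of the sample (nearest-rank method)."""
    ranked = sorted(scores)
    return {p: round(ranked[int(p / 100 * len(ranked))], 1) for p in percentiles}

# Hypothetical population of raw scores for one grade/measure.
population = [random.gauss(25, 6) for _ in range(50_000)]

# Two independent randomly drawn norm samples of 2,000 students each.
cuts_a = percentile_cuts(random.sample(population, 2000))
cuts_b = percentile_cuts(random.sample(population, 2000))

print(cuts_a)  # e.g. roughly {10: 17.3, 25: 21.0, 50: 25.0, 75: 29.0, 90: 32.7}
print(cuts_b)  # cut points land very close to sample A's: the norms replicate
```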

Although we do not have an empirically validated explanation for why this ‘dampening’ of scores occurs, those of us who have taught middle school in the spring have some theories about students’ focus and willingness to ‘try their best’ when they begin school in the fall versus their willingness to apply themselves, and their ability to focus, in the spring.

This oddity in student performance is actually one of the reasons why we recommend using the percentile ranks, not just the raw scores, when interpreting student performance.
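To make that recommendation concrete, here is a minimal sketch of how a season-specific percentile rank is read. This is not the easyCBM implementation, and the winter and spring norm samples are hypothetical numbers chosen only to illustrate the point that a raw score can dip while the percentile rank holds steady.

```python
# Percentile-rank interpretation sketch. Hypothetical norms, not easyCBM data.
from bisect import bisect_left

def percentile_rank(score, norm_sample):
    """Percent of the norm sample scoring strictly below `score`."""
    ranked = sorted(norm_sample)
    return 100 * bisect_left(ranked, score) / len(ranked)

# Hypothetical norms for one grade/measure; note the spring norms sit lower.
winter_norms = [12, 15, 18, 20, 22, 25, 27, 30, 33, 36]
spring_norms = [10, 13, 16, 18, 20, 23, 25, 28, 31, 34]

# A student's raw score dips from 24 (winter) to 23 (spring) ...
print(percentile_rank(24, winter_norms))  # 50.0
# ... yet the percentile rank is unchanged, because peers' scores dipped too.
print(percentile_rank(23, spring_norms))  # 50.0
```

Read against the raw score alone, this student appears to have regressed; read against the seasonal norms, the student held exactly the same standing relative to peers.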
