easyCBM tests are largely norm-referenced assessments of fundamental skills. That is, a student’s performance is compared to national norms: scores based on a large, nationally representative group of same-grade students who took the same assessment. Our norm-referenced scores are reported both as raw totals and as percentiles (1-100). For these assessments, think of baby growth charts, where children’s weight and height are compared to one another. Just like those charts, a student’s percentile rank on a norm-referenced assessment compares their performance to that of other grade-level peers.
In the Detailed Percentiles Tables (Detailed Norms), each measure lists the raw score alongside the percentile rank associated with that raw score. Because percentile ranks run from 1 through 100, a student sometimes does not need to answer every item correctly on a given measure to be ranked at the 100th percentile relative to their grade-level peers in the norming sample (percentile ranks for SAT/ACT/GRE sections work similarly).
For example, if a student gets 19 of 20 items correct on the Grade 7 MCRC Winter benchmark, they are at the 98th percentile, and getting all 20 correct puts them at the 100th (page 59 in the tables). Slightly differently, if a student scores 18 of 20 on the Grade 7 MCRC Spring benchmark, they are already at the 99th percentile (page 59). In that case we do not list the last item or two, because the 100th percentile is assumed for scores of 20 or 19 on the Winter and Spring benchmarks, respectively.
Users should see this pattern across other assessments and measures as well. For example, also on page 59, the Fall PRF (oral fluency) measure shows many raw scores (words correct per minute; wcpm) associated with percentile ranks of 0, 1, 2, and so on. Moving to page 60, a student who reads 231 or more wcpm is at the 99th percentile or higher. Even though the measure contains more words than 231, the tables stop there because listing every score becomes redundant and the remaining high percentile ranks are assumed.
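For readers who work with the norms programmatically, the lookup described above can be sketched in a few lines. This is an illustrative sketch only: the `percentile_rank` function and the `prf_fall` excerpt are hypothetical, and of the rows below only 231 → 99 comes from the example in the text (Grade 7 Fall PRF); the lower rows are made-up placeholders, not actual easyCBM norms.

```python
from bisect import bisect_right

def percentile_rank(raw, table):
    """Look up the percentile for a raw score in a table of
    (raw score, percentile) pairs sorted by raw score.

    Scores at or above the last listed raw score take the top listed
    percentile, mirroring how the printed tables stop once the
    remaining high percentile ranks are assumed."""
    raws = [r for r, _ in table]
    i = bisect_right(raws, raw) - 1
    if i < 0:
        return table[0][1]  # below the lowest listed raw score
    return table[i][1]

# Hypothetical excerpt of a detailed-norms table for Fall PRF (wcpm).
# Only the final row (231 -> 99) is taken from the text; the rest are
# placeholders for illustration.
prf_fall = [(0, 0), (50, 10), (120, 50), (200, 90), (231, 99)]

print(percentile_rank(260, prf_fall))  # 260 wcpm still maps to 99
```

Because the lookup clamps at the last listed row, any wcpm of 231 or higher resolves to the 99th percentile, just as the tables intend.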