Where is the item analysis for math?

For math, we've moved away from providing item-level detail on the easyCBM systems because it was giving teachers a false impression of reliable specificity. You can still find that information if you need it in the technical reports (links are available at: http://www.brtprojects.org/publications/technical-reports). However, we recommend that teachers not try to interpret the very specific standards to which the questions are written and instead look at the general domain. The specific standards will be less reliable (have more standard error of measurement) because only one item is aligned to each, whereas the more general domain has multiple items aligned to it and therefore provides more reliable information.

Integrity is very important to us. We want to be sure that any information we report from the easyCBM assessments is supported by a strong empirical foundation. Early on, we had intended to provide information about the content standard for each math item as part of the Item-Level Reports available on the system. Through repeated analyses of the data, however, we reached the conclusion that providing this information was actually doing a disservice to teachers, because it was giving them a false impression of student skill or lack of skill, with the potential for misinterpretation and, thus, misguided instructional decisions.

Although the items on the easyCBM math assessments are all written to align with specific content standards, when we examine their functioning in relation to each other and to overall math performance, it becomes clear that they are most reliable when used as indicators of general math knowledge and skill rather than of specific knowledge linked to particular content standards. For the items to give a reliable indication of student skill on a particular content standard, we would need to increase the number of items measuring each individual standard, with a corresponding increase in test length. That would defeat one of the key goals of progress monitoring: having frequent, short measures of progress over time toward the whole year's worth of general math knowledge and skill.
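As a rough illustration of why a single item per standard is less dependable than several items per domain, the Spearman-Brown prophecy formula from classical test theory shows how reliability grows as parallel items are added. (These are generic textbook formulas, not the specific analyses behind easyCBM, and the numbers below are hypothetical.)

\[
  \rho_{k} = \frac{k\,\rho_{1}}{1 + (k - 1)\,\rho_{1}},
  \qquad
  \mathrm{SEM} = \sigma \sqrt{1 - \rho_{k}}
\]

With an assumed single-item reliability of \(\rho_{1} = 0.30\), a standard measured by one item stays at 0.30, while a domain with six aligned items rises to roughly 0.72, and the standard error of measurement (SEM) shrinks accordingly.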

Thus, we updated the reports to show the more general domains measured by the math items, reducing the likelihood that teachers would focus inappropriately on specific content standards rather than on the more general math construct.

While we do not provide a standards-level item analysis for math, we do provide an "item analysis" chart, which shows the items each student missed along with the domain of each missed item (see screenshot). You can find these reports under the Reports tab, under Groups.

The tests are designed so that each domain includes some easy, some moderately difficult, and some difficult items. Although the test forms are of equivalent difficulty as a whole, the difficulty of particular domains may vary across forms, making the kind of information displayed in the table you've shared problematic (you may well be basing your decisions on "error" rather than on an accurate reflection of students' mastery of particular domains). This is intentional: the math tests were designed as general outcome measures. We do have "mastery monitoring" measures for math in the lower grades (currently K-3 and soon K-5) through the CBMSkills software, which is available free of charge to all easyCBM Deluxe and District account holders. The CBMSkills math measures are designed to be appropriate for measuring student knowledge/learning at the domain level (and even at more discrete skills).

When you select the Group report, you have access to detailed information about the specific tests your students have taken. The number of tests that have been administered (by type of measure) as well as the average score on each of the measures is indicated in the table. In the event that student data are collected over multiple school years, you may select which data (e.g., last year, this year, or all years) are included in the group reports.

Group Report

To get to the reports, click on the name of a test that your students have taken and then scroll down the page to view the reports.

The bar graph displayed in the Group Report (by measure type under the CBMs section) provides information regarding the heterogeneity of student performance on a specific assessment. If students' scores cluster together in a single skill grouping, it is likely that teachers can effectively meet students' instructional needs with whole-class instruction.

When you have one or a few students who score significantly lower (or higher) than their peers, you may need to investigate opportunities to differentiate instruction to better meet their specific skill-based needs. These reports are intended to assist with grouping for instructional interventions.

Group Summary Report

Below the Group Summary Report is the list of students in the group. Here you can click on the View Test link in the column to the right of the student's name. Clicking on this link will take you to the student's actual graded test, where you will be able to see which items a particular student missed and relate each missed item to the concept it targets.
