Why are the math tests arranged by general topics rather than specific skills?
In easyCBM we don’t have math probes targeting specific math skills because both national and state standards have moved away from skill-based standards (e.g., application, computation, problem-solving) and instead embed those skills within grade-level math domains (e.g., geometry, measurement).
Here is our easyCBM recommendation:
- If the student is functioning close to grade-level expectations (based on benchmarking and other classroom-based data/information), then the Proficient Math probes would likely be most appropriate. They are longer (25-30 problems) and more difficult than the Basic Math measures, and they include applications of math reasoning and multiple problem-solving items on most probes.
- If the student is functioning below their enrolled grade-level expectations (e.g., below Grade 6), then one of the shorter Basic Math probes (just 16 test items) is likely more appropriate; at Grade 6, the Algebra probes contain the most problem-solving/applied-reasoning items.
- If the student is far below grade-level expectations, we suggest working with the teacher, school psychologist, or other support staff to review the actual math problems on the different measures available and then select the measure type that best matches what the student is being taught. Because the test items are similar in style across each of the ten test forms within a given measure type, reviewing the items makes it possible to match what the student is being taught (both in general instruction and, more importantly, during intervention support) to the most appropriate measure(s). If it is unclear which measure is most suitable for a given student, benchmark performance may help identify areas or problem types of weakness.