Lexile scores and determining the readability of a passage
We have not Lexiled the passages used on easyCBM Lite and Deluxe. Our District partners at Riverside Insights, however, have paid for a conversion feature that gives their district account users access to Lexile scores for the passages. To make inquiries, please visit Riverside Insights for more information about purchasing the District Edition.
For the Lite and Deluxe versions, however, keep in mind that measures of reading comprehension involve a complex relationship between text decoding (which is certainly influenced by passage difficulty), the demands of the test questions (e.g., whether students are responding to literal or inferential question types), and the student's mental representation of the story. So, while an estimate of passage difficulty (such as a Lexile score) is one source of information, it is not the only information to consider when evaluating whether to use a particular measure of reading comprehension for progress monitoring.
There are many different ways of determining the 'readability' of a particular passage. Instead of Lexiles, we use a three-step process to ensure that the passages are appropriate for students in the middle of the year at that grade level. A detailed description of this process can be found in the technical reports, which are available from the "About" tab on the website. The research and methodology are described there in detail, and you can select the measures and grades you are most interested in.
Because relying on a strict 'readability index' or grade-level software alone is not sufficient, we have adopted a three-step process to ensure that our passages are appropriate for the grades for which they are intended.
- During development, we make sure that each paragraph falls within the target readability range for the grade (3.4 - 3.6, for instance), and that the passage as a whole is also within this range (a sketch of this kind of screen appears after this list).
- We have each of the passages reviewed by grade-level teachers and modify the wording if needed.
- We pilot the measures with actual students at that grade level (timing the pilot to fall as close to the middle of the year as possible) and make sure that alternate forms of the passages are at about the same difficulty, based on the scores actual students receive when they take multiple passages over a short period of time. For these studies, we typically test in a school for four consecutive days, testing 5 passages with the same students each day. The students act as their own 'control' group, and we verify that there is no significant difference in difficulty between the alternate forms of the passages (an example of this kind of check appears below).
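As an illustration of the first step, here is a minimal sketch of a readability screen. The passage above does not name the readability index or software used, so the Flesch-Kincaid grade-level formula, the crude syllable counter, the file name, and the 3.4 - 3.6 band below are all assumptions chosen for the example.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade_level(text: str) -> float:
    """Flesch-Kincaid grade level, one common readability index."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

def within_band(text: str, low: float, high: float) -> bool:
    """Does this paragraph or passage fall inside the target range?"""
    return low <= fk_grade_level(text) <= high

# Screen each paragraph of a (hypothetical) grade 3 passage against a
# 3.4-3.6 band, then screen the passage as a whole.
passage = open("grade3_form_a.txt").read()
paragraphs = [p for p in passage.split("\n\n") if p.strip()]
print(all(within_band(p, 3.4, 3.6) for p in paragraphs)
      and within_band(passage, 3.4, 3.6))
```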
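And for the third step, here is a sketch of how one might check that alternate forms do not differ in difficulty when the same students take every form. The data below are made up, and the Friedman test is just one reasonable repeated-measures check; the analysis easyCBM actually uses is described in its technical reports, not here.

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Made-up pilot data: 60 students x 5 alternate forms. In a real pilot, each
# cell would be a student's comprehension score on one form, collected over
# a few consecutive testing days so students act as their own control group.
rng = np.random.default_rng(0)
scores = rng.normal(loc=14, scale=3, size=(60, 5))

# Friedman test: a nonparametric repeated-measures comparison of the forms.
stat, p = friedmanchisquare(*(scores[:, j] for j in range(scores.shape[1])))
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
if p > 0.05:
    print("No significant difficulty difference detected between forms.")
else:
    print("At least one form differs in difficulty; revise before release.")
```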
I hope that makes sense -- it's actually a fairly involved process. If you'd like to read more, a variety of technical reports that discuss the measures in detail are available on our website: Technical Reports.