There are some spicy takes on i-Ready I’ve seen recently: Kids hate it, parents hate it, teachers were bamboozled into becoming what amounts to spokesmodels, there’s no data to show it’s working, and experts agree that it’s overselling and underdelivering what it’s calling “differentiated instruction.” What I’m seeing in many of those posts, though, is a hedge like: “It’s a decent screener…”
Here’s the thing, though: It’s not.
And states are quietly starting to go on record about it.
California’s Reading Difficulties Risk Screener Selection Panel (a body created specifically to vet screening tools under the state’s new dyslexia screening mandate) published its approved list for the 2025–26 school year.
Four tools made the cut: Amira, DIBELS 8, Multitudes (out of UCSF’s Dyslexia Center), and ROAR (an open-source solution out of Stanford).
i-Ready, the biggest player in the room, did not.
Under Public Acts 146 and 147 (Michigan’s new K–12 literacy and dyslexia laws), the Michigan Department of Education was required to publish a list of valid and reliable K–3 screening and progress-monitoring assessments by January 1, 2026.
The department dropped it two weeks early, and the document came with two lists: the tools the state trusts to find kids with reading difficulties (Amira, MAP Reading Fluency, and DIBELS 8), and the tools that didn’t clear the bar. i-Ready landed on the second list. According to the published evaluation (summarized at the end of this post), it fell short on quite a few of the state’s required criteria.