Legislation and Reading: The Wisconsin Experience 2004-

Emily Hanford notes the “surge in legislative activity” amidst our long-term, disastrous reading results [link]. Longtime SIS readers may recall a few of these articles, bookmarking our times, so to speak: 2004: [Link] “In 2003, 80% of Wisconsin fourth graders scored proficient or advanced on the WKCE in reading. However, in the same year … Continue reading Legislation and Reading: The Wisconsin Experience 2004-

Underly: “I support Eliminating the Foundations of Reading (FORT)” Teacher Test

Transcript [Machine Generated PDF]: Deborah Kerr: [00:43:53] Um, whose turn is it to go first? Okay. That’s fine. Yeah, we’re pretty good at figuring this out. Um, [00:44:00] so that’s one thing we can do. Um, yes, I support the FORT. I fo I support the Praxis test. So you gotta think about something. Why … Continue reading Underly: “I support Eliminating the Foundations of Reading (FORT)” Teacher Test

Teacher Mulligans, continued: The latest report on reading was really bad. Here are some possible solutions

Alan Borsuk: Mississippi got a lot of attention when the NAEP scores were released. It was the only state where fourth grade reading scores improved. Mississippi is implementing a strong requirement that teachers be well-trained in reading instruction. Massachusetts did that in the 1990s and it paid off in the following decade. Wisconsin passed a … Continue reading Teacher Mulligans, continued: The latest report on reading was really bad. Here are some possible solutions

My Question to Wisconsin Governor Tony Evers on Teacher Mulligans and our Disastrous Reading Results

Wiseye @ 24 September WisPolitics Lunch: Jim Zellmer: Thank you for your service Governor Evers. Under your leadership, the Wisconsin DPI granted mulligans to thousands of elementary teachers who couldn’t pass a reading exam (that’s the “Foundations of Reading” elementary teacher reading content knowledge exam), yet our students lag Alabama, a state that spends less … Continue reading My Question to Wisconsin Governor Tony Evers on Teacher Mulligans and our Disastrous Reading Results

Commentary on Wisconsin K-12 Governance and the November, 2018 Election

Negassi Tesfamichael (https://madison.com/ct/news/local/education/democratic-legislators-look-to-make-big-changes-to-state-education/article_882a0ddd-3671-5769-b969-dd9d2bc795db.html): Many local Democratic state legislators say much of the future of K-12 education in Wisconsin depends on the outcome of the Nov. 6 election, particularly the gubernatorial race between state superintendent Tony Evers, a Democrat, and Republican Gov. Scott Walker. Legislators spoke at a forum at Christ Presbyterian Church Wednesday night, … Continue reading Commentary on Wisconsin K-12 Governance and the November, 2018 Election

“Less discussed in Wisconsin is the tremendous impact that economic status has on student achievement”

Will Flanders: Less discussed in Wisconsin is the tremendous impact that economic status has on student achievement. A school with a population of 100% students who are economically disadvantaged would be expected to have proficiency rates more than 40% lower than a school with wealthier students. Indeed, this economics achievement gap is far larger in … Continue reading “Less discussed in Wisconsin is the tremendous impact that economic status has on student achievement”

Wisconsin DPI: “We set a high bar for achievement,” & abort Foundations of Reading Teacher Content Knowledge Requirement

Molly Beck and Erin Richards: “We set a high bar for achievement,” DPI spokesman Tom McCarthy said. “To reach more than half (proficiency), we would need to raise the achievement of our lowest district and subgroup performers through policies like those recommended in our budget, targeted at the large, urban districts.” The new scores reveal … Continue reading Wisconsin DPI: “We set a high bar for achievement,” & abort Foundations of Reading Teacher Content Knowledge Requirement

On Wisconsin’s (and Madison’s) Long Term, Disastrous Reading Results

Alan Borsuk: But consider a couple other things that happened in Massachusetts: Despite opposition, state officials stuck to the requirement. Teacher training programs adjusted curriculum and the percentage of students passing the test rose. In short, in Wisconsin, regulators and leaders of higher education teacher-prep programs are not so enthused about … Continue reading On Wisconsin’s (and Madison’s) Long Term, Disastrous Reading Results

Support modifications to the Wisconsin PI-34 educator licensing rule

Wisconsin Reading Coalition E-Alert: We have sent the following message and attachment to the members of the Joint Committee for Review of Administrative Rules, urging modifications to the proposed PI-34 educator licensing rule that will maintain the integrity of the statutory requirement that all new elementary, special education, and reading teachers, along with reading specialists, … Continue reading Support modifications to the Wisconsin PI-34 educator licensing rule

Requesting action one more time on Wisconsin PI-34 teacher licensing

Wisconsin Reading Coalition, via a kind email: Thanks to everyone who contacted the legislature’s Joint Committee for Review of Administrative Rules (JCRAR) with concerns about the new teacher licensing rules drafted by DPI. As you know, PI-34 provides broad exemptions from the Wisconsin Foundations of Reading Test (FORT) that go way beyond providing flexibility for … Continue reading Requesting action one more time on Wisconsin PI-34 teacher licensing

Wisconsin DPI efforts to weaken the Foundations of Reading Test for elementary teachers

Wisconsin Reading Coalition, via a kind email: Wisconsin Reading Coalition has alerted you over the past 6 months to DPI’s intentions to change PI-34, the administrative rule that governs teacher licensing in Wisconsin. We consider those changes to allow overly broad exemptions from the Wisconsin Foundations of Reading Test for new teachers. The revised PI-34 has … Continue reading Wisconsin DPI efforts to weaken the Foundations of Reading Test for elementary teachers

Wisconsin Elementary Teacher Content Knowledge Exam Results (First Time Takers)

Foundations of Reading Test (Wisconsin) Result Summary (First Time Takers): May 2013 – August 2014 (Test didn’t start until January 2014, and it was the lower cut score): 2150 pass out of 2766 first time takers = 78% passage rate .xls file September 2014 – August 2015 (higher cut score took effect 9/14): 2173/3278 = … Continue reading Wisconsin Elementary Teacher Content Knowledge Exam Results (First Time Takers)

‘Emergency’ effort to address teacher shortages reflects larger education issues

Alan Borsuk: It’s an emergency. It says so right there on the legal papers: “Order of the State Superintendent for Public Instruction Adopting Emergency Rules.” But it’s a curious kind of emergency. Elsewhere in the paperwork, it uses the term “difficulties.” Maybe that’s a better way to put it. Underlying the legal language lie questions … Continue reading ‘Emergency’ effort to address teacher shortages reflects larger education issues

Wisconsin hopes to mirror Massachusetts’ test success for teaching reading

Alan Borsuk:

A second-grade teacher notices that one of her students lacks fluency when reading aloud. The first thing the teacher should do to help this student is assess whether the student also has difficulties with:
A. predicting
B. inferring
C. metacognition
D. decoding
Don’t worry if you’re not into metacognition. The correct answer is decoding — at least according to the people who put together the test teachers must pass in Massachusetts if they are going to teach children to read.
The Massachusetts test is about to become the Wisconsin test, a step that advocates see as important to increasing the quality of reading instruction statewide and, in the long term, raising the overall reading abilities of Wisconsin students. As for those who aren’t advocates (including some who are professors in schools of education), they are going along, sometimes with a more dubious attitude toward what this will prove.
The Wisconsin Department of Public Instruction officially launched the era of the new test for reading licenses with a memo sent last week to heads of all teacher preparation programs in the state. The memo spelled out the details of implementing a law passed in 2011 that called for Wisconsin to use the Massachusetts test. The memo included setting the passing score, which, after a short phase-in period, will match what is regarded as the demanding Massachusetts standard.
In a nutshell, after Jan. 31, 2014, anyone who wants to get a license that allows them to teach reading in Wisconsin will have to pass this test, with 100 multiple choice questions and two essay questions, aimed at making sure they are adequately prepared to do so. (Those currently licensed will not need to pass the test.)
Why Massachusetts? Because in the 1990s, Massachusetts launched initiatives, including requiring students to pass a high school graduation test, requiring teachers to pass licensure tests specific to the subjects they teach, and increasing spending on education, especially in schools serving low-income children.
At that point, Wisconsin and Massachusetts were pretty much tied, and down the list of states a bit, when it came to how students were doing. Within a few years, scores in Massachusetts rose significantly. The state has led the nation in fourth- and eighth-grade reading and math achievement for a decade. Wisconsin scores have stayed flat.

Many notes and links on Wisconsin’s adoption of Massachusetts (MTEL) elementary English teacher content knowledge standards. UW-Madison Professor Mark Seidenberg recommended Wisconsin’s adoption of MTEL.

Implementation of Wisconsin’s Statutory Screening Requirement

Wisconsin Reading Coalition, via a kind email [170K PDF]:

The selection of an early reading screener for Wisconsin is a decision of critical importance. Selecting the best screener will move reading instruction forward statewide. Selecting a lesser screener will be a missed opportunity at best, and could do lasting harm to reading instruction if the choice is mediocre or worse.
After apparently operating for some time under the misunderstanding that the Read to Lead Task Force had mandated the Phonological Awareness Literacy Screening (PALS), the Department of Public Instruction is now faced with some time pressure to set up and move through a screener evaluation process. Regardless of the late start, there is still more than enough time to evaluate screeners and have the best option in place for the beginning of the 2012-13 school year, which by definition is the time when annual screeners are administered.
The list of possible screeners is fairly short, and the law provides certain criteria for selection that help limit the options. Furthermore, by using accepted standards for assessment and understanding the statistical properties of the assessments (psychometrics), it is possible to quickly reduce the list of candidates further.
Is One Screener Clearly the Best?
One screener does seem to separate itself from the rest. The Predictive Assessment of Reading (PAR) is consistently the best, or among the best, in all relevant criteria. This comment is not a comparison of PAR to all known screeners, but comparing PAR to PALS does reveal many of its superior benefits.
Both PAR and PALS assess letter/sound knowledge and phonemic awareness, as required by the statute.
In addition, PAR assesses the important areas of rapid naming and oral vocabulary. To the best of our knowledge, PAR is the only assessment that includes these skills in a comprehensive screening package. That extra data contributes unique information to identify children at risk, including those from low-language home environments, and consequently improves the validity of the assessment, as discussed below.
Both PAR and PALS have high reliability scores that meet the statutory requirement. PAR (grades K-3) scores .92, PALS-K (kindergarten) scores .99, and PALS (grades 1-3) scores .92. Reliability simply refers to the expected uniformity of results on repeated administrations of an assessment. A perfectly reliable measurement might still have the problem of being consistently inaccurate, but an unreliable measurement always has problems. Reliability is necessary, but not sufficient, for a quality screener. To be of value, a screener must be valid.
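To make the reliability-versus-validity point concrete, here is a small Python sketch with invented toy scores (they are not from PAR, PALS, or the memo): a screener that returns nearly identical scores on a retest is highly reliable, yet it can still correlate weakly with true reading ability, i.e., be invalid.

```python
import statistics

# Toy numbers only (not from the memo, PAR, or PALS) illustrating
# "reliable but not valid": the retest nearly matches the first
# administration (high reliability), yet neither tracks true ability well.
true_ability = [12, 25, 33, 41, 58, 64]   # hypothetical true reading skill
screener_t1  = [38, 42, 39, 43, 40, 41]   # first administration
screener_t2  = [38, 42, 39, 43, 41, 41]   # retest: almost identical

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"test-retest reliability: {pearson_r(screener_t1, screener_t2):.2f}")   # ~0.98
print(f"validity vs. true ability: {pearson_r(screener_t1, true_ability):.2f}")  # ~0.34
```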
In the critical area of validity, PAR outscores PALS by a considerable margin. Validity, which is also required by the statute, is a measure of how well a given scale measures what it actually intends to measure, leaving nothing out and including nothing extra. In the case of a reading screener, it is validity that indicates how completely and accurately the assessment captures the reading performance of all students who take it. Validity is both much harder to achieve than reliability, and far more important.
On a scale of 0-1, the validity coefficient (r-value) of PAR is .92, compared to validity coefficients of .75 for PALS-K and .68 for PALS. It is evident that PAR outscores PALS-K and PALS, but the validity coefficients by themselves do not reveal the full extent of the difference. Because the scale is not linear, the best way to compare validity coefficients is to square them, creating r-squared values. You can think of this number as the percentage of success in achieving accurate measurement. Measuring human traits and skills is very hard, so there is always some error, or noise. Sometimes, there is quite a lot.
When we calculate r-squared values, we get .85 for PAR, .56 for PALS-K, and .46 for PALS. This means that PAR samples 51 to 84 percent more of early reading ability than the PALS assessments. The PALS assessments measure about as much random variance (noise) as actual early reading ability. Validity is not an absolute concept, but must always be judged relative to the other options available in the current marketplace. Compared to some other less predictive assessments, we might conclude that PALS has valid performance. However, compared to PAR, it is difficult to claim that PALS is valid, as required by law.
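The r-squared arithmetic above is easy to verify. Below is a minimal Python sketch that uses only the validity coefficients quoted in the memo (.92, .75, .68); squaring them and comparing the results reproduces the memo's figures to within rounding (roughly 50-52% more variance explained than PALS-K and 83-85% more than PALS).

```python
# Arithmetic check only; the coefficients are those quoted in the memo above.
validity = {"PAR": 0.92, "PALS-K": 0.75, "PALS": 0.68}

# Square each validity coefficient to get r-squared: the share of variance
# in early reading ability that the screener actually captures.
r_squared = {name: r ** 2 for name, r in validity.items()}
for name, r2 in r_squared.items():
    print(f"{name}: r-squared = {r2:.2f}")   # PAR 0.85, PALS-K 0.56, PALS 0.46

# PAR's relative advantage in variance explained over each PALS form.
for name in ("PALS-K", "PALS"):
    gain = (r_squared["PAR"] - r_squared[name]) / r_squared[name] * 100
    print(f"PAR explains about {gain:.0f}% more variance than {name}")
# ~50% more than PALS-K and ~83% more than PALS before rounding the squares;
# rounding the squares first gives ~52% and ~85%.
```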
PAR is able to achieve this superior validity in large part because it has used 20 years of data from a National Institutes of Health database to determine exactly which sub-tests best predict reading struggles. As a consequence, PAR includes rapid naming and oral vocabulary, while excluding pseudo-word reading and extensive timing of sub-tests.
PAR is norm-referenced on a diverse, national sample of over 14,000 children. That allows teachers to compare PAR scores to other norm-referenced formative and summative assessments, and to track individual students’ PAR performance from year to year in a useful way. Norm referencing is not required by the statute, but should always be preferred if an assessment is otherwise equal or superior to the available options. The PALS assessments are not norm-referenced, and can only classify children as at-risk or not. Even at that limited task of sorting children into two general groups, PAR is superior, accurately classifying children 96% of the time, compared to 93% for PALS-K, and only 73% for PALS.
PAR provides the unique service of an individualized report on each child that includes specific recommendations for differentiated instruction for classroom teachers. Because of the norm-referencing and the database on which it was built, PAR can construct simple but useful recommendations as to what specific area is the greatest priority for intervention, the intensity and duration of instruction which will be necessary to achieve results, and which students may be grouped for instruction. PAR also provides similar guidance for advanced students. With its norm-referencing, PAR can accurately gauge how far individual children may be beyond their classmates, and suggest enriched instruction for students who might benefit. Because they are not norm-referenced, the PALS assessments cannot differentiate between gray-area and gifted students if they both perform above the cut score.
PAR costs about the same as PALS. With bulk discounts for statewide implementation, it will be possible to implement PAR (like many other screeners) at K5, 1st grade, 2nd grade, and possibly 3rd grade with the funds allocated by statute for 2012-13. While the law only requires kindergarten screening at this time, the goal is to screen other grades as funds allow. The greatest value of screening with a norm-referenced instrument comes when we screen in several consecutive years, so the sooner the upper grades are included, the better.
PAR takes less time to administer than PALS (an average of 12-16 minutes versus 23-43).
The procurement procedure for PALS apparently can be simplified because it would be a direct purchase from the State of Virginia. However, PAR is unique enough to easily justify a single-source procurement request. Salient, essential features of PAR that would be likely to eliminate or withstand a challenge from any other vendor include demonstrated empirical validity above .85, norm-referencing on a broad national sample, the inclusion of rapid naming and oral vocabulary in a single, comprehensive package, empirically valid recommendations for differentiated intervention, guidance on identifying children who may be gifted, and useful recommendations on grouping students for differentiated instruction.
Conclusions
The selection of a screener will be carefully scrutinized from many perspectives. It is our position that a single, superior choice is fairly obvious based on the facts. While it is possible that another individual or team may come to a different conclusion, such a decision should be supported by factual details that explain the choice. Any selection will have to be justified to the public as well as specific stakeholders. Some choices will be easier to justify than others, and explanations based on sound criteria will be the most widely accepted. Simple statements of opinion or personal choice, or decisions based on issues of convenience, such as ease of procurement, would not be convincing or legitimate arguments for selecting a screener. On the other hand, the same criteria that separate PAR from other screeners and may facilitate single-source procurement also explain the choice to the public and various stakeholder groups. We urge DPI to move forward reasonably, deliberately, and expeditiously to have the best possible screener in place for the largest possible number of students in September.

Although there is still a long way to go on improving reading scores, Brown Deer schools show that improvements can be made. From the Milwaukee Journal Sentinel:

There are signs that the long struggle to close the achievement gap in reading has a chance of paying off. There is a long way to go – and recent statewide test scores were disappointing – but we see some reason for encouragement, nonetheless.
Alan J. Borsuk, a former Journal Sentinel education reporter and now a senior fellow in law and public policy at Marquette University Law School, reports that black 10th-graders in the Brown Deer school district did better in reading than Wisconsin students as a whole, with 84.2% of Brown Deer’s black sophomores rated proficient or advanced in reading, compared with 78.1% for all students and 47.7% for all black 10th-graders in the state. Some achievement gaps remain in this district that is less than one-third white, but they are relatively modest.

Schools are working to improve reading

As vice chair of the Read to Lead Task Force, I am pleased that Wisconsin is already making progress on improving literacy.
The Read to Lead Task Force members deserve credit for making recommendations that center on improving reading by: improving teacher preparation and professional development; providing regular screening, assessment and intervention; ensuring early literacy instruction is part of early childhood programs; and strengthening support for parental involvement in reading and early literacy programs.
Across Wisconsin, districts and schools are working to implement the Common Core State Standards in English language arts and mathematics. These standards are designed to increase the relevance and rigor of learning for students. Milwaukee’s Comprehensive Literacy Plan is a significant step that defines common expectations in reading for Milwaukee Public Schools students, who now receive reading instruction through one curriculum that is consistent across schools.

Learn more about Wisconsin’s Read to Lead Task Force and the planned MTEL teacher content knowledge standards here.
www.wisconsin2.org.

More Commentary on Proposed Wisconsin Teacher Licensing Content Requirements

Alan Borsuk

In 1998, Massachusetts debuted a set of tests it created for people who wanted teaching licenses. People nationwide were shocked when 59% of those in the first batch of applicants failed a communications and literacy test that officials said required about a 10th-grade level of ability.
Given some specifics of how the tests were launched, people who wanted to be teachers in Massachusetts probably got more of a bum rap for their qualifications than they deserved. But the results certainly got the attention of people running college programs to train teachers. They changed what they did, and the passing rate rose to about 90% in recent years.
One more thing: Student outcomes in Massachusetts improved significantly. Coming from the middle of the pack, Massachusetts has led the nation in fourth- and eighth-grade scores in reading and math on National Assessment of Educational Progress (NAEP) tests for almost a decade.
Could this be Wisconsin in a few years, especially when it comes to reading?
Gov. Scott Walker and state Superintendent of Public Instruction Tony Evers released last week the report of a task force aimed at improving reading in Wisconsin. Reading results have been stagnant for years statewide, with Wisconsin slipping from near the top to the middle of the pack nationally. Among low-income and minority students, the state’s results are among the worst in the country.