School Information System

April 14, 2009

Examining District Data on the Effects of PBIS

As noted in an earlier post, the school district presented data at Monday night's meeting on the effects of implementing a strategy of Positive Behavioral Interventions and Support (PBIS). As the report notes, "Documenting behavior referrals is inconsistent across middle schools both in terms of what is recorded and where it is recorded." This makes it unwise to compare middle schools to one another, since one school may refer students who are late to class while another only makes referrals as a consequence of fighting. It is valid, however, to make comparisons across time within the same school in order to see what effect the implementation of PBIS has had on student behavior. Unfortunately, as readers of the report will observe, not even that data is presented consistently across the 11 middle schools where PBIS has been implemented: some schools have data only for the current academic year, others only from February 2008 through February 2009, and others provide more.

While the behavioral scientist in me wants to comment on the parts of the report that are incomprehensible (the self-assessment survey schoolwide system analysis from each school) or redundant (providing charts that show time saved in minutes, in hours, and in days), I will restrict my comments to the data documenting the effects of the implementation of PBIS. While there have been some impressive successes with PBIS, e.g., Sherman, there have also been failures, e.g., Toki. One interpretation would be that some schools have been successful in implementing these strategies and we need to see what they are doing that has led to their success; another would be that PBIS has by and large failed and resulted in an increase in behavioral referrals across our middle schools. At this point, I'll take the middle ground and say that this new approach to dealing with student behavior hasn't made any difference. You can look at the table below and draw your own conclusions. Keep in mind, though, that as noted above there is no consistency across schools in what sorts of behavioral problems get documented. There is also considerable variability in the absolute number of referrals across the 11 middle schools and across months, such that a 30% change in the number of behavioral referrals may reflect 45 referrals at Blackhawk, 10 referrals at Wright, and 170 referrals at Toki.
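To make that scale caveat concrete, here is a minimal back-of-the-envelope sketch in Python. It simply reverses the arithmetic: given the approximate referral counts quoted above for a 30% change, it recovers the rough referral volume each figure implies. The baselines are inferred from those counts, not taken from the MMSD report.

```python
# A rough sketch of the scale caveat above: the same 30% change corresponds
# to very different absolute referral counts depending on a school's overall
# referral volume. The counts are the approximate figures quoted in the
# paragraph; the implied baselines are simply back-calculated from them.

PERCENT_CHANGE = 0.30  # the 30% change used in the example above

# school -> approximate number of referrals that a 30% change represents
referrals_for_30_percent = {
    "Blackhawk": 45,
    "Wright": 10,
    "Toki": 170,
}

for school, count in referrals_for_30_percent.items():
    implied_baseline = count / PERCENT_CHANGE  # count = baseline * 0.30
    print(f"{school}: 30% of roughly {implied_baseline:.0f} referrals "
          f"is about {count}")
```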

MMSD Behavioral Referral Data (presented 4/13/09)
| Comparison Data Provided | School | Results: Change from 07/08 to 08/09* |
| --- | --- | --- |
| None | Cherokee | |
| None | Jefferson | |
| None | Whitehorse | |
| One month only (February) | Blackhawk | 30% decline (decrease of 40 referrals) |
| One month only (February) | O'Keefe | 10% decline (decrease of 10 referrals) |
| One month only (February) | Spring Harbor | 35% increase (increase of 8 referrals) |
| One month only (February) | Wright | 20% increase (increase of 7 referrals) |
| Six months (Sept. - Feb.) | Hamilton | Declines in Sept. (20%), Nov. (20%), and Dec. (10%); increases in Oct. (40%) and Jan. (20%); no change in Feb. |
| Six months (Sept. - Feb.) | Sennett | Increases every month, ranging from 5% (Dec.) to 75% (Feb.); median increase in referrals: 20% |
| Six months (Sept. - Feb.) | Toki | Increases every month, ranging from 7% (Nov.) to 200% (Sept.); median increase in referrals: 68% |
| Multiple years | Sherman | Decreases every month, ranging from 30% (Feb.) to 70% (Oct., a drop of more than 250 referrals); median decrease in behavioral referrals: 42% |
* Note that these percentages are approximate, based on visual inspection of the charts provided by MMSD.

Posted by Jeff Henriques at April 14, 2009 11:28 AM