As Shamrock parents, we often find that we're at odds with "experts" of various kinds. Education these days is supposed to be "data-driven." This means that if school officials propose a policy, they have to offer "data" showing the policy will work. Often, they do this by hiring fancy consulting firms that are supposed to analyze the studies that have been done on various subjects and make "data-driven" recommendations.
Such recommendations are always slickly presented, with nifty graphics, lots of colors, and arrows pointing everywhere. But their content is often far less impressive. These reports, we've learned through experience, frequently recommend the latest policy fads, whether or not the research supports them.
This is bad news for advocates of small classes, like the ones we have at Shamrock now. Small classes are definitely not "in" these days. The hot thing is teacher training and performance. The educational-efficiency folks currently favored by big funders like the Gates Foundation don't like small classes, because small classes are expensive. Instead, they're pushing changes they say will make teaching more efficient -- new training, "pay-for-performance," etc.
And rather than suggesting that school districts improve teaching while keeping small classes, they're arguing that class size doesn't really matter. In fact, some suggest that having small classes undercuts the strategy of improving teaching, because smaller classes mean you need more good teachers.
So those of us who have seen small classes work at our schools feel under siege. We have to fight back -- intellectually and politically.
The intellectual part is frustrating, because it isn't really very hard. On a lot of these subjects, the data isn't all that good. It's easy to manipulate, and easy to spot manipulation.
Last week, for example, the consulting group Educational Resource Strategies (ERS) presented recommendations for cutting the CMS budget. Class size was target number one.
The study mustered an apparently impressive set of facts and figures to make its case. In an especially dramatic flourish, the authors asserted that "Other investments – with similar or less expense than dramatic class size reductions – show even more significant gains in student performance." (In plainer English, this means they think that other strategies give you more bang for the buck than small classes.)
It went on to argue that: "[Professional development] with classroom instructional coaches showed a performance increase of 1.25 to 2.70 of a standard deviation, versus reducing class size to 15 in grades K-3 [which] showed a performance increase of 0.25 of a standard deviation." (Again in plainer English, they're saying that coaching teachers is up to ten times more effective than small classes -- 2.70 vs. .25 -- in improving student performance.)
Pretty persuasive! Why spend money reducing class size when you can get ten times the results with classroom instructional coaches!
But those numbers sounded too good to be true. So I looked a little closer. After just a couple of hours of digging it was clear that the comparison was worthless.
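Just how too good? One quick sanity check: effect sizes measured in standard deviations can be translated into percentile moves for an average student, if you assume test scores follow a roughly bell-shaped curve. (That's a standard simplification, and to be clear, the little Python sketch below is my own back-of-the-envelope check, not anything from the ERS presentation.)

```python
# Back-of-the-envelope: translate an effect size (in standard deviations)
# into the percentile an average student would land at, assuming test
# scores are roughly normally distributed. The effect sizes are the ones
# quoted in the ERS slides; everything else here is my own illustration.
from math import erf, sqrt

def percentile_after_gain(effect_size_sd):
    """Percentile reached by a student who starts at the mean (50th
    percentile) and gains `effect_size_sd` standard deviations."""
    return 100 * 0.5 * (1 + erf(effect_size_sd / sqrt(2)))

for label, d in [("class size of 15 in K-3", 0.25),
                 ("instructional coaching, low end", 1.25),
                 ("instructional coaching, high end", 2.70)]:
    print(f"{label}: 50th -> {percentile_after_gain(d):.1f}th percentile")

# Prints (approximately):
#   class size of 15 in K-3: 50th -> 59.9th percentile
#   instructional coaching, low end: 50th -> 89.4th percentile
#   instructional coaching, high end: 50th -> 99.7th percentile
```

In other words, the high end of that coaching claim would take a perfectly average kid to roughly the 99.7th percentile. An intervention that reliably did that to student test scores would be the educational find of the century -- which is exactly the kind of alarm bell that made me keep digging.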
It turns out that those classroom coaching numbers -- that impressive "1.25 to 2.70 of a standard deviation" -- measured how much more TEACHERS learned about TEACHING when classroom coaches helped them learn and practice new teaching techniques. Despite what the ERS report claimed, the numbers had NOTHING to do with student performance.
This wasn't an apples to apples comparison. It wasn't even an apples to oranges comparison. It was on the order of an apples to fuzzy little squirrels comparison.
This is the point when I give up on education and start to rant about the decline and fall of the United States of America. Here are well-paid consultants who supposedly know what they're doing. What they do and say has a major impact on other people's futures -- in this case, the kids of CMS. But their work has no foundation. They're just like those well-paid bankers in fancy suits who wrecked the economy with their bad loans. We are a nation that has forgotten how to do things responsibly and well! Our stint at the top of the world is over!
To get back to the report, one can question whether the bad comparison represents a dumb mistake, a calculated misrepresentation, or a fundamental lack of rigor in ERS's research. Personally, I think it was the last of those, although I'm more suspicious about some of the other problems in the report.
So, what will happen? My guess is: not much. Errors like this make me mad, because I hate sloppy work, I believe in small classes, and I do think that dumb stuff like this is sending our country down the tubes. But I don't know how many other people will feel so strongly.
I don't think ERS can really defend comparing apples to fuzzy little squirrels. But excuses will be made -- and probably accepted. They'll likely pull that piece out of the report, and stand by their recommendations anyway (although I don't think that their other evidence is much better). Will anyone get seriously reprimanded -- or even fired? Will they change the way they work? Probably not.
Even more to the point, will our class sizes go up? That's the political part. Stay tuned.
To see the full ERS presentation, click here: boardhighlights. Scroll to the bottom of the page and then click on the link that says "Click here to view the presentation from ERS."