The other day, while wandering around the Charlotte Observer's website, I came across an intriguing number. On last year’s CMS student survey, 67 percent of Shamrock’s students agreed with the statement: “My teachers make school work interesting.”
That 67 percent was the second highest score among CMS elementaries and the third highest in the district. Points for the underdog! Kudos to our teachers for raising test scores without becoming deadly dull!
The more surprising number, though, was the average elementary response – a mere 36 percent. Middle and high schools averaged an even more dismal 15 percent. What is going on? Do CMS students really think their teachers are that boring, or does it just seem cool to say they are?
(Charts and graphs are from the Charlotte Observer. For a comparison graph of the "my teacher makes school work interesting" data, see: http://www.charlotteobserver.com/static/includes/datapages/cmsprofile/ss09_inter.htm)
The current obsession with educational data means there’s plenty of it around – test scores, surveys, “school quality reviews,” and much, much more.
In a large district, where few people have direct experience of more than a handful of schools, this data takes on enormous importance. Yet trying to figure out what it really means can be a mind-bending exercise. You can waste endless hours playing with the numbers.
Last year, for example, I spent some time comparing survey and test score data from our school with that from a far wealthier elementary (a poverty rate of 24 percent as opposed to our 86 percent).
While the wealthy school had a considerably higher percentage of students on grade level, a look at some of the more challenged groups of students showed a different picture.
The wealthy school's percentage of low-income students on grade level barely topped ours (40 percent vs. 39). A higher percentage of our African American students scored at the tests' top levels. I couldn’t compare the performance of ESL students, because the wealthy school didn’t have any.
Yet when asked to rate the “overall effectiveness” of the school, 97 percent of the wealthy school's teachers rated it as excellent or good. Only 50 percent of Shamrock’s teachers rated our school that highly. What do these differences mean? Who has higher standards? Which is the better school?
One of the problems with all this data is that it's easiest to use for simple comparisons, as parents do when deciding between schools. If a school is near the top of a list, we assume, surely it must be better than those in the middle or near the bottom.
Data also has more powerful effects when it confirms what you already think you know.
If the bottom rung of the "my teachers make school work interesting" graph were occupied by low-wealth, low-scoring Ashley Park, many would no doubt find it easy to imagine classes of restless students being drilled by a weary, frazzled staff.
That the lowest score (14 percent) in fact belongs to high-wealth, high-scoring Endhaven makes it a little harder to jump to simple conclusions.
Happily, Shamrock's high score confirmed what I already knew: that our teachers pour their hearts into their work. So I'm going to put my questions aside, and tell everyone I know.
To play around with some of this data yourself, go to: http://www.charlotteobserver.com/education/ and choose a school under the "Searchable Data on CMS Schools" section. I hear rumors that CMS also has some data on its much-touted "Data Dashboard," but since the dashboard won't let Macs access it, I can't confirm them.