Saturday, January 30, 2010


I was brought up not to swear.

My parents grew up in rural and small town Texas Presbyterian homes. I have never heard a swear word pass the lips of either one. We just don't talk that way.

Neither, during the first 40+ years of my life, did I feel the need to break that habit. Most swearing, in my opinion, is just plain lazy. There are far more clever ways to express oneself. I've been doing my best to teach Parker that philosophy as well.

Lately, however, I'm starting to rethink. Clever turns of phrase suddenly seem thin in the face of some of the things people say about kids like Parker's Shamrock classmates.

For example, last week an Observer article noted that Charlotte's high poverty schools are far less likely than wealthier schools to offer extracurricular activities such as chess.

One reader remained unmoved. "Really, do you think a low income, city school is going to have any participation in a chess tournament?" he or she wrote in the Observer's e-forum. "Any kid that participated would be laughed at and shot!"

I know first-hand how much Shamrock's kids love chess, and how long the waiting list for chess club has been. I also know about the excellent competitive teams fielded by several of our high-poverty high schools. There's a word I'd like to use instead of "reader" here. I can say it in my head. But I can't bring myself to say it out loud, or to write it.

Then there's all the other stuff.

After Peter and I identified a glaring error in the research cited by a CMS consultant (see "Battles: Class Size"), the consultant shifted tactics. In a revised report, the assertion that "Research supports" the consultant's recommendations became "There is general consensus in research and reform communities" about those recommendations.

A couple of days later, a reporter asked the consultant for specifics. He replied that "there's not a lot of research that speaks to CMS's situation right now," but that he felt "pretty comfortable" about the recommendations nonetheless (for the full story, see the previous post).

There's a good, two-syllable, Anglo-Saxon word that sums this up with pinpoint exactitude. But I can't say it -- at least not yet. You'll have to do it yourself.

Thursday, January 28, 2010

Battles: Class Size II

When Peter and I uncovered the apples-to-fuzzy-little-squirrels error in a report presented to the CMS Board of Education by consultant Education Resource Strategies (see previous post), we wrote an editorial for the Charlotte Observer (click here). A week later, the Observer reported on reactions from CMS and ERS. Sadly, it was just as we predicted.

CMS consultant admits goof, but says point is right
At issue: Best way to teach if budget must shrink

By Ann Doss Helms
Posted: Friday, Jan. 29, 2010

Two parent activists were right when they pointed out a research flaw in a Charlotte-Mecklenburg Schools consultant's report, that consultant told the school board this week.

But there's still room to argue over the significance.

The report is designed to guide CMS leaders in making tough budget choices while preserving academic quality.

Jonathan Travers, director of the nonprofit Education Resource Strategies, says that if officials have to slice the budget, they'd do better to cut some teachers, make classes slightly larger and invest in making the remaining teachers more effective.

In a Jan. 12 presentation to the school board, he cited research to support that premise.

In a Jan. 20 opinion piece published in the Observer, Pamela Grundy and Peter Wong called one of his research citations "worthless - or worse." Grundy and Wong, who are married and have a son at Shamrock Gardens Elementary, said a study cited as evidence that classroom coaching brought student gains never looked at student achievement.

Instead, it measured what teachers learned from academic coaches.

"These are our children. We fail to understand why their futures should be shaped by studies that ignore fundamental principles of responsible research," the pair wrote.

They say small classes should be preserved, especially in the early grades at high-poverty schools such as Shamrock.

In a follow-up session with the school board this week, Travers acknowledged the research should not have been used to make the case for student gains. But he told the board there's plenty of research to show effective teaching makes more difference to students than small changes in class size.

Superintendent Peter Gorman agreed. "The research has been clear for years: To have class size outweigh teacher effectiveness, the change has to be huge."

CMS is working with researchers and consultants, including ERS, to figure out how to measure and boost teacher effectiveness. Thursday night, Travers couldn't point to research showing that specific efforts, such as teacher training or coaching, lead to student gains.

"I mean, it's tough," he said. "There's not a lot of research that speaks to CMS's situation right now."

Nonetheless, he said, he's "pretty comfortable" recommending that CMS cut up to 200 teacher jobs to save up to $10 million, instead spending that money to provide expert support and other aids to teacher effectiveness.

Grundy said this week that she doesn't believe the research clearly indicates shifting money from small classes to teacher support would help kids. She said at Shamrock, where students have made gains on state exams, good teachers are staying longer.

CMS spokeswoman Kathleen Johansen said the Bill and Melinda Gates Foundation pays ERS for its consulting services to CMS; the amount was not available Thursday.

Tuesday, January 19, 2010

Battles: Class Size

As Shamrock parents, we often find that we're at odds with "experts" of various kinds. Education these days is supposed to be "data driven." This means if school officials propose a policy, they have to offer "data" which shows that policy will work. Often, they do this by hiring fancy consulting firms that are supposed to analyze the studies that have been done on various subjects, and make "data driven" recommendations.

Such recommendations are always slickly presented, with nifty graphics, lots of colors and arrows pointing everywhere. But their content is often far less impressive. Such reports, we've learned through experience, frequently recommend the latest policy fads, whether or not the research supports them.

This is bad news for advocates of the small classes we now have at Shamrock. Small classes are definitely not "in" these days. The hot thing is teacher training and performance. The educational efficiency folks currently favored by big funders like the Gates Foundation don't like small classes, because they're expensive. Instead, they're pushing changes they say will make teaching more efficient -- new training, "pay-for-performance," etc.

And rather than suggesting that school districts improve teaching while keeping small classes, they're arguing that class size doesn't really matter. In fact, some suggest that having small classes undercuts the strategy of improving teaching, because smaller classes mean you need more good teachers.

So those of us who have seen small classes work at our schools feel under siege. We have to fight back -- intellectually and politically.

The intellectual part is frustrating, because it isn't really very hard. On a lot of these subjects, the data isn't all that good. It's easy to manipulate, and easy to spot manipulation.

Last week, for example, the consulting group Education Resource Strategies (ERS) presented recommendations for cutting the CMS budget. Class size was target number one.

The study mustered an apparently impressive set of facts and figures to make its case. In an especially dramatic flourish, the authors asserted that "Other investments – with similar or less expense than dramatic class size reductions – show even more significant gains in student performance." (In plainer English, this means they think that other strategies give you more bang for the buck than small classes.)

It went on to argue that: "[Professional development] with classroom instructional coaches showed a performance increase of 1.25 to 2.70 of a standard deviation, versus reducing class size to 15 in grades K-3 [which] showed a performance increase of 0.25 of a standard deviation." (Again in plainer English, they're saying that coaching teachers is up to ten times more effective than small classes -- 2.70 vs. .25 -- in improving student performance.)

Pretty persuasive! Why spend money reducing class size when you can get ten times the results with classroom instructional coaches!

But those numbers sounded too good to be true. So I looked a little closer. After just a couple of hours of digging it was clear that the comparison was worthless.

It turns out that those classroom coaching numbers -- that impressive "1.25 to 2.70 of a standard deviation" -- measured how much more TEACHERS learned about TEACHING when classroom coaches helped them learn and practice new teaching techniques. Despite what the ERS report claimed, the numbers had NOTHING to do with student performance.

This wasn't an apples to apples comparison. It wasn't even an apples to oranges comparison. It was on the order of an apples to fuzzy little squirrels comparison.

This is the point when I give up on education and start to rant about the decline and fall of the United States of America. Here are well-paid consultants who supposedly know what they're doing. What they do and say has a major impact on other people's future -- in this case, the kids of CMS. But their work has no foundation. They're just like those well-paid bankers in fancy suits who wrecked the economy with their bad loans. We are a nation that has forgotten how to do things responsibly and well! Our stint at the top of the world is over!

To get back to the report, one can question whether the bad comparison represents a dumb mistake, a calculated misrepresentation or a fundamental lack of rigor in ERS research. Personally, I think it was the last, although I'm more suspicious of some other problems in the report.

So, what will happen? My guess is not so much. Errors like this make me mad, because I hate sloppy work, I believe in small classes and I do think that dumb stuff like this is sending our country down the tubes. But I don't know how many other people will feel so strongly.

I don't think ERS can really defend comparing apples to fuzzy little squirrels. But excuses will be made -- and probably accepted. They'll likely pull that piece out of the report, and stand by their recommendations anyway (although I don't think that their other evidence is much better). Will anyone get seriously reprimanded -- or even fired? Will they change the way they work? Probably not.

Even more to the point, will our class sizes go up? That's the political part. Stay tuned.

To see the full ERS presentation, click here: boardhighlights. Scroll to the bottom of the page and then click on the link that says "Click here to view the presentation from ERS."

Wednesday, January 6, 2010


I compromised a principle today.

I wrote Shamrock a check for just over $60 to pay for 20 Accelerated Reader (AR) tests.

For those of you who haven't had the pleasure of working with AR, it's a computerized system that allows students to take standardized tests on books they've read. They get points for every test they pass.

The books are rated according to sentence and word length, and the longer the book the more points the test is worth. The Very Hungry Caterpillar has a difficulty rating of 2.9 and a point value of .5. Harry Potter and the Order of the Phoenix has a difficulty rating of 7.2 and a whopping point value of 44.

Accelerated Reader is a perfect match for the current educational mentality – packaged, computerized and measured to an impressive-looking tenth of a point. Scores can be endlessly analyzed, compared, graphed and regressed.

Like many schools, Shamrock makes AR a big deal. Students have individual reading goals for every quarter, with a big party for those who reach their marks. At the end of the year, the school bestows awards upon the top point-getters in every grade.

But Accelerated Reader has its problems. Most obvious, its multiple-choice tests are strictly factual (some parents would say trivial). A sample question for Harry Potter and the Order of the Phoenix might be: "When Harry and Cho Chang went to Hogsmeade for Valentine's Day, they stopped at a) The Three Broomsticks b) Zonko's Joke Shop c) Madame Puddifoot's or d) Honeyduke's." (Warning to potential cheaters: I made this one up.)

Basically, the tests measure whether a student has actually read the words in a book, not whether he or she has understood the themes, character development, descriptive power, or anything else that matters.

Our marvelous media center specialist occasionally points out that her take on AR is significantly shaded by the fact that The Color Purple and The Poky Little Puppy have identical difficulty ratings of 4.0 (although, admittedly, a student would get 9 points for reading The Color Purple, and only .5 for The Poky Little Puppy).

These ratings cause their own complications. It is far too easy for teachers to use AR numbers to micro-manage students' reading -- requiring students to read only those books within a specified range. Three years ago, a teacher who thankfully is no longer at our school told one of Parker's friends that he wasn't allowed to read Harry Potter and the Goblet of Fire because its difficulty level of 6.8 lay above his designated range (he read it anyway, took the test on the sly, and notched a perfect score, which earned him 32 points and a scolding from the annoyed instructor).

The ratings can also trip up high-flying students, whose mastery of high-difficulty books such as John Madden's Heroes of Football (7.5) can lead them to shun works with lower difficulty levels but far greater thematic richness, works such as Madeleine L'Engle's A Wrinkle in Time (4.7), Lois Lowry's Number the Stars (4.5), or Ernest Hemingway's The Sun Also Rises (4.4!).

And at a school like ours, where funds are at a premium, student reading can be limited by the number of tests that the school can afford to buy (as with so much else in modern education, measuring student progress means big profits for the companies that can snare a piece of the action). In a world of educational equality, we'd do what wealthier schools do and buy the program that gives students access to every AR test there is, so they could get credit for reading anything and everything that strikes their fancy. But that setup costs several thousand dollars, plus a yearly license fee. It's beyond our reach.

For these and other reasons, I don't really like AR. But given all the things we have to fight for at Shamrock, challenging AR's privileged position isn't my top priority. And Parker's started to read the Percy Jackson books – The Lightning Thief, The Sea of Monsters, and so on. I think they're great, and I know that if the school has all the tests, more of our students will be encouraged to read them as well.

So I've written out the check (you have to buy at least 20 at a time, so I'm throwing in tests for the Artemis Fowl books, the Diary of a Wimpy Kid books and a few others as well).

Like our media specialist, I'd rather buy books than tests. But sometimes you have to compromise.