Friday, March 12, 2010

But the Data Says I Suck!

This is sort of in response to Miss Eyre's post on the NYC Educator blog. I admire Miss Eyre for many reasons, and most recently because she took a dim view of the new teacher data reports despite a score that put her at the head of the class, if you will. Me, I take a dim view because the score I got earned me a dunce cap. That is, if you believe that data, which I don't.

When I first opened that email, my heart sank. I thought maybe they had emailed me the square root of my actual number by mistake.

Now, I've been teaching a long time, and I have a long track record of excellent results, so this number was a real shocker. I know how hard I work, and I know that my administration holds me in pretty high regard, so it just didn't make any sense. But there it was.

I went to work the next day hoping that no one would ask my results. As I slunk past friends' classrooms, I felt as if a large neon sign with my pathetic number and the words "You Suck!" was blinking above my head. But an odd thing happened. No one asked me my number, and no one told me theirs. I thought maybe word of my diminutive digits had leaked somehow and my colleagues were just being tactful. I mustered the courage to tell a good friend what I'd gotten, and I was shocked when he told me his. It was even worse.

Now, this guy is a good teacher. I sure didn't believe his number could be real. As the day passed, I spoke with a few other colleagues, and it turned out that I was far from the bottom performer in my school. By the time the day ended, I felt like Mr. Chips.

Here's the rub. If you believe the data, almost everyone in my school is a bum and a slacker. But the truth is I work at one of the highest-rated schools in the city according to the state test results. And it has been one of the best schools for many years. So how did a bunch of rubes like us, the sum of whose teacher data numbers adds up to a single decent teacher, manage to produce some of the best results in the city?

The answer is simple. The numbers are worthless.

As it turns out, we have such high-scoring students that it's nearly impossible to move them up. How do you add value to students who already have perfect or near-perfect scores? You can't. And I'll bet that the same is true of working with students at the bottom--they are at the bottom because they have reading difficulties, and moving them up a year or more when they are already several grade levels behind must be damn near impossible.
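
Just to make that ceiling point concrete, here's a rough sketch in Python with made-up numbers (the 100-point cap and the flat 10-point "true" improvement are my own assumptions for illustration, not anything from the actual state test):

# Rough sketch with made-up numbers: suppose every student genuinely
# improves by 10 points, but the test tops out at 100. The score sheet
# can only record the part of the gain that fits under the ceiling.
def observed_gain(start_score, true_gain=10, ceiling=100):
    """Gain the test can actually show once scores hit the ceiling."""
    return min(start_score + true_gain, ceiling) - start_score

for start in (60, 75, 90, 95, 99):
    print(f"starting score {start:3d} -> recorded gain {observed_gain(start):2d}")

The kid who starts at 99 can "add" at most one point no matter how good the teaching is, while the kid who starts at 60 has room to show the whole gain.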

The DOE took a test that the state knows is invalid--and that can be passed just by guessing--and then came up with a bunch of complicated formulas to derive a number that tells a lot of good, hardworking teachers that they suck. In the same vein, there are surely some lousy teachers out there who feel like they can coast now because their numbers were better than expected.

The only bad result that came of my lousy number was that it made me feel awful for a day or so. I felt bad because I care about my students and I take my teaching seriously--that's what makes me (and most of you) a good teacher. I am over it now, because I know it is horse shit and I know it can't be used to evaluate me. Yet. I do wonder how this mess will affect those teachers who are up for tenure this year, and who can be evaluated by that score. How many dedicated and hardworking teachers will be refused tenure because their number isn't high enough for the DOE?

And where is the UFT on this one?

5 comments:

Miss Eyre said...

Excellent piece, and thanks for the plug. I had my doubts about the usefulness of these numbers before, and your post here puts the numbers in an even sharper perspective; namely, how hard it is to quantify how much a teacher *adds* to an already very low or already very high score. That, of course, assumes that the tests are in any way valid or reliable, and there's a lot of evidence that these tests are, well, not.

melody said...

It's a well-known fact that it's much easier to move a low-scoring kid up on a standardized test than it is to move a high-scoring kid. Besides "ceiling" and "floor" effects (not much room to move up at the top or down at the bottom), test scores often contain lots of random measurement error. In any given year's test scores, many low scores are due to negative error, while high scores are due to positive error. So the next year, the low scorers pop up while the high scorers sink down. Real teacher value-added models use sophisticated strategies, including multiple years of data, to deal with these issues -- and even they do not result in highly precise estimates. It's highly unlikely that the NYC DOE uses a sophisticated model. It would be amusing to ask them what the standard error of estimate on your report is. You can be pretty sure that it would be huge, probably not quite large enough to make you statistically indistinguishable from a stump of wood, but nearly so.
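
If you want to see that bounce-back in action, here's a minimal simulation in Python (made-up Gaussian abilities and errors, nothing like the DOE's actual model): the teacher adds exactly nothing in either year, yet grouping kids by their first-year score makes the bottom decile look like it "gained" a lot and the top decile look like it lost ground.

import random
import statistics

# Minimal sketch (not the DOE's model): same true ability in both years,
# zero real teacher effect, but each year's score carries random error.
random.seed(42)

N = 10_000
true_ability = [random.gauss(650, 30) for _ in range(N)]
year1 = [a + random.gauss(0, 20) for a in true_ability]  # observed score, year 1
year2 = [a + random.gauss(0, 20) for a in true_ability]  # observed score, year 2

# Group students by their *observed* year-1 score.
ranked = sorted(range(N), key=lambda i: year1[i])
bottom, top = ranked[: N // 10], ranked[-(N // 10):]

def mean_change(group):
    return statistics.mean(year2[i] - year1[i] for i in group)

print(f"bottom decile 'gain': {mean_change(bottom):+.1f}")  # strongly positive
print(f"top decile 'gain':    {mean_change(top):+.1f}")     # strongly negative

That's pure regression to the mean doing all the work; a naive year-over-year comparison would hand the "gains" to whoever happens to teach last year's low scorers.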

Pogue said...

Great post. What I find most interesting is your line, "The DOE...came up with a bunch of complicated formulas to derive a number that..." This sounds exactly like how Wall Street bamboozled everyone. Take numbers, twist them and formulate them so no one understands what the hell they mean other than the twisters. Education-wise, the media will present these stats on the "de-formers'" behalf, no questions asked. Newsweek is doing it. NBC with Brian Williams is doing it. The "teachers are bad" story continues while thousands are out of work and losing their homes.

NYC Educator said...

And those out of work and losing their homes ask, "Why do teachers have this stuff?"

They should be asking, "Why don't we have this stuff?"

And when the economy is good, of course, no one seems to envy us at all.

Anonymous said...

At this point the word should be out that you can't accurately rate teachers based on the test scores their students receive.
Random ratings, it turns out, would be just as accurate.