Saturday, May 14, 2011

The Value Added Lie

Maybe some math teacher can help me here. I'm trying to understand the numbers behind the controversial Teacher Data Reports issued to NYC teachers. These TDRs rank English and Math teachers in the 4th through 8th grades by assigning a percentile based upon the alleged "value" the teacher added to his or her students' scores. This number is increasingly important as Governor Cuomo pressures the UFT to accept these value-added scores as 40% of a teacher's evaluation.

Indefatigable blogger Reality-Based Educator often pegs the margin of error of these numbers at 12-35%, while the UFT claims "the average margin of error is plus or minus 27 points, or a spread of 54 points." Even the sample TDR the DOE provides shows a margin of error of +/- 25 points (although, in typical DOE doublespeak, the report calls it a "range" and not a margin of error).

Now, if any of these figures is correct, or anywhere near correct, it's clear that these numbers are garbage. The sample DOE report shows a teacher with a percentile rank of 50 whose true rank may be anywhere from about 22 to 72.
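To put the arithmetic in plain terms, here is a tiny sketch using the numbers from the sample report (my own illustration, not the DOE's actual formula):

```python
# Numbers from the DOE's sample TDR; the calculation here is my own
# illustration, not the DOE's value-added model.
point_estimate = 50   # reported percentile rank
low, high = 22, 72    # the "range" printed on the sample report

width = high - low
print(f"Reported rank: {point_estimate}th percentile")
print(f"Possible range: {low} to {high} ({width} points wide)")
# Percentile ranks run from 1 to 100, so this one teacher's true rank
# could plausibly fall anywhere in half of the entire scale.
```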

I'm no statistician, but I am a baseball fan, so I can understand and explain why these numbers stink. A baseball team ranked at the 50th percentile would be perfectly average. But a team that won 22% of its games would be among the worst teams in major league history, while a team with a winning percentage of 72 would be among the greatest of all time. Baseball fans, who tend to eat up crazy stats, would spit on value-added because it doesn't mean anything.
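To put rough numbers on the analogy, here's a back-of-the-envelope sketch, assuming a modern 162-game season:

```python
# Back-of-the-envelope: translate the winning percentages in the
# analogy into wins over a modern 162-game season.
GAMES = 162
for pct in (0.22, 0.50, 0.72):
    print(f"{pct:.0%} winning percentage -> about {round(pct * GAMES)} wins")
# 22% is roughly 36 wins (historically awful), 50% is 81 wins (dead
# average), and 72% is about 117 wins (an all-time great season).
```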

I understand that the "margin of error" is meant to show the range into which a teacher may fall in a given year. But I would argue that the number is even more meaningless than it appears when we look across multiple years. I'll use myself as an example. Two years ago, my TDR placed me at the very bottom of the pile, with a single-digit score. According to the report, the highest score I could have attained, given the margin of error, was a 33. Yet this year, I scored at the very top, and the lowest score I could have attained, according to the report, is an 83.

So, according to these reports, even given the margin of error, there was a 50-point difference between the best teacher I could have been one year and the worst teacher I could have been the next year.

That is 50 points beyond the margin of error.
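For anyone who wants to check my arithmetic, here is the comparison as a sketch, using the bounds straight off my two reports:

```python
# Bounds taken from my own two TDRs.
year1_best = 33   # highest rank the first report allowed, given its range
year2_worst = 83  # lowest rank the second report allowed, given its range

gap = year2_worst - year1_best
print(f"Gap between the two ranges: {gap} percentile points")
# If the margin of error captured the real uncertainty, the two years'
# ranges for the same teacher should at least overlap. They don't.
print("The ranges overlap." if gap <= 0 else "The ranges don't even touch.")
```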

Some math maven will likely point out that this result is over two years, and the value-added score only measures one year, but I really don't see how that matters.

I am the same teacher, in the same school, teaching the same subject to the same grade, using the same curriculum and lessons, and my score swung almost 90 percentile points.

Perhaps my results are extreme, but they happened. I've spoken to many teachers who've had drops or spikes nearly as large. To me, that means that just about anyone can find himself in danger of hitting the bottom and becoming a target of administrators.

If any math teachers care to explain where my analysis went wrong, I'd like to hear it. Or perhaps I'm right, and the value-added numbers just don't add up to much.


3 comments:

Michael Fiorillo said...

The judge who permitted the publication of the TDRs stated that their lack of validity was immaterial - not so different from Antonin Scalia saying that a death row inmate's innocence should not interfere with their execution - and inadvertently pointed to the harsh truth of the situation.

These scores, like the tests themselves, although they may have a pseudo-scientific veneer to soothe some delicate sensibilities, are weapons to be used against tenure, seniority, the very underpinnings of the union, and teaching as a career. They are the lever to effect the "disruptive innovation" of the traditional labor markets and labor power in the schools.

It's no coincidence that attacks against tenure and seniority are occurring simultaneously with the increased digitalization and outsourcing of instruction.

ASTRAKA said...

A. Talk,
Here is a good place to start.
http://www.ams.org/notices/201105/rtx110500667p.pdf

I have posted in the past about the misuse of value-added scores. In fact, I have stated that our union's leadership accepted this idea with minimal research and complete ignorance of the subject of value-added models (VAMs). It was criminal negligence on their part. I can never forgive them for their ignorance and their stupidity.
They have allowed the pseudo-reformers to misuse this tool (VAMs) in their shameless destruction of public education.

zulma said...

I teach math at the HS level, and teachers at that level don't deal with those ridiculous TDRs.

However, I really don't understand how they figure out the percentile, and it is the worst fuzzy math I have ever read! How can you add value to a student who is an unpredictable variable? It is another form of using the instability of the student population to grade teachers. If I have three students in my class who have parent involvement and good attendance, and who apply themselves in class, they will show progress in my class. If two of those students move to another class, another borough, or out of the city, and they are replaced with two students with no parent involvement, or who live in a shelter, or who have academic problems and emotional challenges, then my TDR for the school year will be greatly affected, both for me and for the next teacher who receives those students the following term.
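To put toy numbers on this (purely hypothetical score gains, not real data):

```python
# Toy numbers, not real data: a small class's average score gain before
# and after two well-supported students move away and are replaced by
# two students facing serious challenges outside school.
supported = [12, 10, 11]    # hypothetical gains: parent involvement, good attendance
others = [5, 6, 4, 7, 5]    # hypothetical gains for the rest of the class

def avg(gains):
    return sum(gains) / len(gains)

before = supported + others
after = supported[:1] + [-2, 0] + others   # two supported students replaced

print(f"Average gain before the churn: {avg(before):.1f}")
print(f"Average gain after the churn:  {avg(after):.1f}")
# Same teacher, same lessons -- a visibly different "value added."
```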

What the DoE should be doing is coming up with programs to help those students who are confronted with so many challenges in their personal lives, instead of wasting so much time, energy, and money trying to destroy a teacher's career with fuzzy math that is not applicable to teaching.