Back in 2010, the Los Angeles Times published value-added scores for LA teachers. This may have been good journalism, but from the point of view of helping education the appropriate technical term is “dumbass.” Hey, value-added is an important tool. Judging a teacher on nothing but value-added is ridiculous. More importantly, how many employers conduct job evaluations in public? Bad LA Times.
Bad idea or good, it’s useful to look back and see what resulted from the Times decision. Peter Bergman and Matthew Hill have taken a look at the effect of the publication of VAM scores. It turns out that ratings were not published for teachers with fewer than 60 student VAM scores. Presumably, teachers just above the publication cutoff were not systematically different from those just below it. So comparing outcomes for teachers on either side of the 60-score threshold identifies the effect of publication.
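To make the comparison concrete, here is a minimal sketch of the cutoff-based design described above. All the data is synthetic and the effect size is made up purely for illustration; only the 60-score publication cutoff comes from the study.

```python
import random

random.seed(0)

# Ratings were published only for teachers with at least 60 student scores.
CUTOFF = 60
BANDWIDTH = 10  # compare teachers within +/- 10 scores of the cutoff

# Simulate teachers: score count, publication status, and a synthetic
# next-year outcome with a small bump for published teachers (a stand-in
# for parent sorting; the 0.2 figure is invented for this sketch).
teachers = []
for _ in range(5000):
    n_scores = random.randint(20, 120)
    published = n_scores >= CUTOFF
    outcome = random.gauss(0.0, 1.0) + (0.2 if published else 0.0)
    teachers.append((n_scores, published, outcome))

# Teachers just above the cutoff (published) vs. just below (not published).
just_above = [o for n, _, o in teachers if CUTOFF <= n < CUTOFF + BANDWIDTH]
just_below = [o for n, _, o in teachers if CUTOFF - BANDWIDTH <= n < CUTOFF]

# Because teachers near the cutoff should be otherwise similar, the gap in
# mean outcomes estimates the effect of publication.
effect = sum(just_above) / len(just_above) - sum(just_below) / len(just_below)
print(f"Estimated publication effect near the cutoff: {effect:.3f}")
```

The narrow bandwidth is the whole trick: far from the cutoff, teachers with many scores differ from those with few (experience, class assignments), but within a few scores of 60 the publication decision is close to arbitrary.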
Some of the findings:
- Highly rated teachers subsequently found themselves teaching better students, while low-rated teachers got worse students.
- Notably, teachers rated highly relative to other teachers in the same school were the ones who subsequently picked up better students.
- No noticeable effect on teachers leaving.
So it looks like the main effect of publication was to let more savvy parents arrange for their kids to get the better teachers.
One caveat: Teachers with around 60 student scores are likely to be relatively inexperienced. The results might be different for more experienced teachers.