If you’re a fan of value-added, a new paper by Marianne Bitler, Sean Corcoran, Thurston Domina, and Emily Penner, “Teacher Effects on Student Achievement and Height: A Cautionary Tale,” is going to give you the willies. Since the results may shake you up, I should warn you that the research is preliminary and the results could change. I’m fairly sure the word “cautionary” in the title will stick around, though.
Here’s the heart of what the authors did: They estimated a standard value-added model on data for New York City 4th and 5th grade teachers. Their results look pretty standard: the difference between a good teacher and a bad teacher is quite large. What they did next is decidedly nonstandard. This particular data set also recorded each student’s height each year. So the authors estimated the usual kind of value-added model, replacing math and reading gains with height gains. Now, we know that teachers don’t have any effect on how fast 4th and 5th graders grow. The willies-generating result is that the estimated teacher value-added on height is nearly as large as it is on academic subjects!
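To see how a value-added estimate can look large even when the true teacher effect is zero, here is a minimal simulation sketch (my own toy setup, not the authors' model or data): height gains are generated as pure student-level noise, yet a naive single-year "value-added" estimate (each teacher's mean class gain) still shows a substantial spread across teachers.

```python
# Toy illustration (assumed parameters, not the paper's estimator): with one
# year of data and ~25 students per teacher, sampling noise alone produces a
# wide spread of estimated "teacher effects" on an outcome teachers cannot
# affect.
import numpy as np

rng = np.random.default_rng(0)
n_teachers, class_size = 200, 25

# Simulated height gains: pure student-level noise (sd = 1.0); by
# construction, every teacher's true effect is exactly zero.
gains = rng.normal(0.0, 1.0, size=(n_teachers, class_size))

# Naive single-year "value-added": each teacher's mean class gain.
vam = gains.mean(axis=1)

print("true teacher-effect sd: 0.00")
print(f"estimated 'effect' sd:  {vam.std():.2f}")  # theory: ~ 1/sqrt(25) = 0.20
```

The spread of the estimated "effects" is driven entirely by class-level sampling noise, which is one candidate explanation for why height value-added can look nearly as dispersed as academic value-added in a single year of data.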
If value-added models seem to show huge effects when we know there aren’t any (height), why should we believe any value-added estimated effect (like academics)?
Precisely because the current results are so shocking, the authors are continuing their investigation. In some pre-preliminary results (i.e., not even written up yet), the authors are finding that if you estimate value-added by averaging over a number of years, the spurious effects for height go away while the academic effects mostly remain. If these newer findings prove robust, then one might conclude that value-added models legitimately indicate that teachers are very important for academic gains, while still being nervous about the use of models based on a single year’s data.
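The intuition behind the multi-year result can be sketched in the same toy framework (again my own assumed setup, not the authors' estimator): averaging over several years shrinks pure-noise "effects" toward zero, while a genuine, persistent teacher effect survives the averaging.

```python
# Toy illustration (assumed parameters): averaging per-teacher estimates over
# five years shrinks spurious "height effects" but leaves a real, persistent
# math effect largely intact.
import numpy as np

rng = np.random.default_rng(1)
n_teachers, class_size, n_years = 200, 25, 5

# A real teacher effect on math (sd = 0.25), constant across years.
true_math_effect = rng.normal(0.0, 0.25, size=n_teachers)

def student_noise():
    # Fresh student-level noise each year (sd = 1.0).
    return rng.normal(0.0, 1.0, size=(n_years, n_teachers, class_size))

height_gains = student_noise()                                 # no teacher effect
math_gains = true_math_effect[None, :, None] + student_noise() # real effect

one_year_height = height_gains[0].mean(axis=1)    # single-year estimate
multi_year_height = height_gains.mean(axis=(0, 2))  # 5-year average
multi_year_math = math_gains.mean(axis=(0, 2))

print(f"height 'effect' sd, 1 year:  {one_year_height.std():.2f}")
print(f"height 'effect' sd, 5 years: {multi_year_height.std():.2f}")
print(f"math effect sd, 5 years:     {multi_year_math.std():.2f}")
```

Because the height "effects" are independent noise each year, averaging over T years shrinks their spread by roughly a factor of sqrt(T), while the persistent math effect does not shrink, matching the pattern the authors report in their pre-preliminary results.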