In the book Punished by Rewards, Alfie Kohn describes the perverse effects of incentive systems. Incentive systems fail because it's always easier (and cheaper) to game the system than to actually win. If you incentivize "call length" in a call center, for example, employees will hang up on people. If you incentivize "resolutions," the staff will find shortcuts to mark calls as "resolved" rather than actually solve people's problems. Kohn found that the best way to get people to do good work is to compensate them adequately and support their efforts to do good work.
Several years ago I wrote a post about educational measurement in which I described a phenomenon I have come to call the Melamine Effect. In 2007 and 2008, first pet food and then milk were found to be contaminated with melamine. The consequences were horrific. In 2007, hundreds of pets were killed. In 2008, tens of thousands of infants were poisoned, many suffering kidney damage. Several died. It turned out that unscrupulous people were diluting milk with water and then adding this cheap industrial chemical, which coincidentally increases the score on a widely used test for protein content, as a way to increase their profits.
Seeing the effects that standardized testing and so-called education reform were having on schools, I realized that the circumstances are perfectly analogous. If you measure something and use that measure to understand natural systems, you're fine. But if you start looking for treatments that will shift the measure, you're inviting all kinds of perverse effects, because educational measures can't actually measure what people are interested in (i.e., learning or understanding): they only measure factors that tend to covary with those qualities in natural populations. Once you start applying treatments, especially the cheapest ones, you're almost assured of toxic effects.
And it should go without saying that merely shifting a measure doesn't mean you will actually produce better outcomes. For years, people were told to take niacin to improve their cholesterol test scores. But a long-term study revealed that, although niacin did improve the scores, those gains were not associated with any reduced risk of disease.
For a while the buzzword was "data driven"; more recently it's "evidence based". When people start using words like these, your hackles should rise. Look critically at the underlying model and how it relates the evidence to the dimension you actually care about. This isn't always easy with the "dashboards" of modern analytics systems. But it's the only way to avoid the Melamine Effect.