The Algorithm Made Me Do It

Rebecca Wexler:

Right now, in Loomis v. Wisconsin, the U.S. Supreme Court is deciding whether to review the use of COMPAS in sentencing proceedings. Eric Loomis pleaded guilty to fleeing a traffic officer and driving a car without the owner’s permission. When COMPAS ranked him “high risk,” he was sentenced to six years in prison. He tried to argue that using the system to sentence him violated his constitutional rights by penalizing him for being male. But Northpointe refuses to reveal how it weights sex in its calculations.

We do know certain things about how COMPAS works. It relies in part on a standardized survey where some answers are self-reported and others are filled in by an evaluator. Those responses are fed into a computer system that produces a numerical score. But Northpointe considers the weight given to each input, and the predictive model used to calculate the risk score, to be trade secrets. That makes it hard to challenge a COMPAS result. Loomis might have been penalized because of his sex, and that penalty might have been unconstitutional. But as long as the details are secret, his challenge can’t be heard.
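To see why the secrecy matters, consider a toy model of this kind of system. Everything below is invented for illustration: COMPAS’s real survey questions, weights, and model are trade secrets, so this sketch simply shows how, in a generic weighted-score design, a single mis-entered yes/no answer can push a score across a risk threshold.

```python
# Hypothetical illustration only: the questions, weights, and threshold
# here are invented. COMPAS's actual inputs and model are trade secrets.

# Invented survey answers: 1 = "yes", 0 = "no".
answers = {
    "prior_arrests": 1,
    "gang_affiliation": 0,   # suppose this is the mis-entered question
    "stable_housing": 1,
}

# Invented weights (positive weights raise risk, negative ones lower it).
weights = {
    "prior_arrests": 3.0,
    "gang_affiliation": 4.0,
    "stable_housing": -2.0,
}

def risk_score(answers, weights):
    """Weighted sum of survey answers -- a stand-in for the secret model."""
    return sum(weights[q] * a for q, a in answers.items())

def risk_label(score, threshold=4.0):
    """Map a numeric score to a risk category at an invented threshold."""
    return "high risk" if score >= threshold else "low risk"

correct = risk_score(answers, weights)
print(correct, risk_label(correct))      # 1.0 low risk

# Flip the one mis-entered answer from "no" to "yes".
erroneous = dict(answers, gang_affiliation=1)
mistaken = risk_score(erroneous, weights)
print(mistaken, risk_label(mistaken))    # 5.0 high risk
```

Without knowing the weights and the threshold, a defendant in this toy system could never tell whether the flipped answer mattered; with them, the effect is a one-line calculation.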

What surprised me about the letter from Eastern was that its author could prove something had gone very wrong with his COMPAS assessment. The “offender rehabilitation coordinator” who ran the assessment had checked “yes” on one of the survey questions when he should have checked “no.” Ordinarily, without knowing the input weights and predictive model, it would be impossible to tell whether that error had affected the final score. The mistake could be a red herring, not worth the time to review and correct.