California just replaced cash bail with algorithms
But the shift to algorithmic risk assessment makes the details of how the new legislation is implemented all the more important. While some activists say that cash bail created a predatory industry built on keeping people tied to the criminal justice system, others point to the racial biases documented in algorithmic risk assessment tools already in use.
“It is a tremendous setback,” Raj Jayadev, a coordinator at civil rights activist organization Silicon Valley Debug, told Quartz. “This will, in our analysis, lead to an increase in pretrial detention.”
That’s because the machine learning systems used to calculate risk scores throughout the criminal justice system have been shown to exhibit severe racial biases, rating people of color as more likely to commit future crimes. Furthermore, since private companies have typically been contracted to provide these services, the formulas their algorithms use to calculate the scores are generally withheld as intellectual property, on the grounds that disclosure would tip off competitors to a company’s technology.
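To make the mechanism concrete, here is a minimal, hypothetical sketch of how such a risk score might be computed. Actual tools are proprietary, so the features, weights, and threshold below are invented for illustration only. The point is that even a model that never sees race as an input can produce racially skewed scores when its inputs, such as prior arrest counts, reflect biased policing.

```python
import math

# Hypothetical feature weights for a toy pretrial risk scorer.
# Real vendors' models are trade secrets; these numbers are invented.
WEIGHTS = {
    "prior_arrests": 0.45,   # arrest counts reflect policing patterns, not just behavior
    "age_under_25": 0.80,
    "failed_to_appear": 1.10,
}
BIAS = -2.0              # intercept
DETAIN_THRESHOLD = 0.5   # scores above this flag a defendant as "high risk"

def risk_score(defendant: dict) -> float:
    """Logistic-regression-style score in [0, 1]."""
    z = BIAS + sum(WEIGHTS[f] * defendant.get(f, 0) for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# Race is never an input, yet a proxy feature can smuggle it in:
# if over-policing inflates prior_arrests in some neighborhoods,
# otherwise-similar defendants receive very different scores.
a = {"prior_arrests": 1, "age_under_25": 1, "failed_to_appear": 0}
b = {"prior_arrests": 4, "age_under_25": 1, "failed_to_appear": 0}  # same conduct, more police contact

for name, d in [("defendant A", a), ("defendant B", b)]:
    s = risk_score(d)
    print(f"{name}: score={s:.2f}, detain={s > DETAIN_THRESHOLD}")
```

In this toy example, defendant A scores about 0.32 and defendant B about 0.65, crossing the detention threshold solely because of extra police contact. And because the weights in a real system are treated as trade secrets, a defendant has no way to audit which inputs drove the score.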