How Personal Data and Algorithms Send People to Prison

Detroit Today: Should we allow algorithms to sway the criminal justice system?

How much control do we give data and algorithms over our lives? Should they be able to determine how long an inmate stays behind bars? These questions were raised by a recent Wisconsin Supreme Court case and addressed by Detroit Today host Stephen Henderson in an interview with Barbara Levine, associate director of CAPPS, and University of Michigan professor John Cheney-Lippold, author of We Are Data: Algorithms and the Making of Our Digital Selves.

In the Wisconsin case, the court upheld a man’s six-year sentence that was based on a third-party algorithm that computes a person’s likelihood of violence, recidivism, and pretrial risk. The inmate had no way to challenge the formula, as it was deemed confidential proprietary information. Studies later found the algorithm, COMPAS, to be racially biased, rating inmates of color as higher risk than their white peers. COMPAS is used by the Michigan Department of Corrections (MDOC) in risk assessments for Michigan inmates.

Cheney-Lippold told Detroit Today that algorithms that process data patterns instead of real people can be dangerous tools. “There’s this computer science term called ‘garbage in, garbage out,’ that if you have racist policing data in the beginning, you’re going to get a racist output,” he said.

Levine spoke more broadly about the problems with using risk assessments in sentencing. COMPAS, for example, takes into account employment status, school discipline records, and past criminal history, all factors Levine says perpetuate a cycle in which young Black men are charged with crimes more often. “I mean unemployed, prior arrest, describes what, half of Black men in Detroit?” she asked. “You could be high risk and never have committed a violent crime.”
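To make Levine’s point concrete, here is a purely hypothetical sketch of how a simple weighted score built on such proxy factors can label someone “high risk” even if they have never committed a violent crime. The feature names, weights, and cutoff below are invented for illustration; COMPAS’s actual formula is confidential and not described in the interview.

```python
# Illustrative only: a toy linear "risk score" over proxy features like the
# ones Levine names (employment, school discipline, prior arrests).
# The weights and threshold are invented; this is NOT COMPAS's formula.

WEIGHTS = {
    "unemployed": 2.0,
    "school_discipline_record": 1.5,
    "prior_arrests": 1.0,          # applied per prior arrest
    "prior_violent_offense": 3.0,
}
HIGH_RISK_THRESHOLD = 4.0          # arbitrary cutoff for this example


def toy_risk_score(person: dict) -> float:
    """Sum the weighted proxy features into a single score."""
    return sum(WEIGHTS[feature] * person.get(feature, 0) for feature in WEIGHTS)


# A person with no violent history can still clear the "high risk" bar
# purely on socioeconomic proxies, which is the cycle Levine describes.
never_violent = {"unemployed": 1, "school_discipline_record": 1, "prior_arrests": 2}
score = toy_risk_score(never_violent)
print(score)                            # 5.5
print(score >= HIGH_RISK_THRESHOLD)     # True: flagged despite no violent offense
```

Because the inputs correlate heavily with race and poverty, a score like this can flag a large share of one community as high risk regardless of any history of violence, which is the concern Levine raises.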

Listen to the entire interview on WDET’s Detroit Today here.