Risk Assessments Spread in Bail Decisions Amid Criticism

The bail reform movement has found an alternative to cash bail: artificial intelligence algorithms that can scour large sets of courthouse data, searching for associations and predicting which people are most likely to flee or commit another crime, the Associated Press reports. Experts say the use of these risk assessments may be the biggest shift in courtroom decision-making since judges began accepting social science and other expert evidence more than a century ago. Christopher Griffin of Harvard Law School’s Access to Justice Lab calls the new digital tools “the next step in that revolution.” Critics worry that such algorithms could end up supplanting judges’ own judgment, and might even perpetuate biases in ostensibly neutral form.

States including New Jersey, Arizona, Kentucky, and Alaska have adopted these tools. Defendants who receive low scores are recommended for release under supervision. The algorithms aim to reduce rulings that could be influenced by a defendant’s race, gender, or clothing. The system used in New Jersey, developed by the Houston-based Laura and John Arnold Foundation, uses nine risk factors to evaluate a defendant, including age and past criminal convictions. It excludes race, gender, employment history, and where a person lives. It also excludes a history of arrests, which can stack up against people likely to encounter police even if they have done nothing wrong. ProPublica found that a proprietary commercial system called COMPAS, used to help determine prison sentences, was falsely flagging black defendants as likely future criminals almost twice as frequently as white defendants. The Supreme Court last year declined to take up the case of a Wisconsin man who argued that the use of gender as a factor in the COMPAS assessment violated his rights.

from https://thecrimereport.org