Computer Program That Calculates Prison Sentences Is Even More Racist Than Humans, Study Finds


Multicultural Meanderings

Not surprising that computer programs and their algorithms can incorporate existing biases, as appears to be the case here:

A computer program used to calculate people’s risk of committing crimes is less accurate and more racist than random humans assigned to the same task, a new Dartmouth study finds.

Before they’re sentenced, people who commit crimes in some U.S. states are required to take a 137-question quiz. The questions, which range from queries about a person’s criminal history, to their parents’ substance use, to “do you feel discouraged at times?”, are part of a software program called Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS. Using a proprietary algorithm, COMPAS is meant to crunch the numbers on a person’s life, determine their risk for reoffending, and help a judge determine a sentence based on that risk assessment.

Rather than making objective decisions, COMPAS actually plays up racial biases…



