Intelligence in Our Image: Risks of Bias and Errors in Artificial Intelligence
A Case Study: Artificial Agents in the Criminal Justice System

The U.S. criminal justice system is increasingly resorting to algorithmic tools. Artificial agents help ease the burden of managing such a large system. But any systematic algorithmic bias in these tools would carry a high risk of errors and cumulative disadvantage.

We first look at the use of algorithms at the sentencing and parole phase. Angwin et al. (2016) reported on Northpointe's Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) criminal risk assessment system. This software is used in sentencing and parole hearings across the country. Angwin et al. presented anecdotes showing the system misrepresenting the recidivism risks of different convicts. These anecdotes hint at a systematic racial bias in the risk estimation: Black convicts were being rated higher than non-Black convicts, even when the non-Black convicts had more-severe offenses. The authors followed up on this hint with an analysis of COMPAS and recidivism data from Broward County, Florida. Larson et al. (2016), detailing the statistical analysis in Angwin et al. (2016), found that Black defendants were twice as likely as white defendants to be misclassified as a higher risk of violent recidivism, and that white recidivists were misclassified as low risk 63.2% more often than Black recidivists.

Police departments are also resorting to algorithmic tools for predictive policing and for allocating resources. We present an example of a simulation showing how a mathematically acceptable algorithm can result in inequitable predictive policing behavior. Figure 1 illustrates the simulated behavior of an automated system, trained on historical crime data, for directing law enforcement efforts in finding and responding to criminal events in the whole population.
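The disparity Larson et al. report is a difference in group-wise error rates: the false-positive rate (non-recidivists labeled high risk) and the false-negative rate (recidivists labeled low risk) computed separately per racial group. The sketch below shows the shape of that comparison; the record format and field names are hypothetical illustrations, not the actual Broward County data schema.

```python
def error_rates_by_group(records):
    """Compute per-group false-positive and false-negative rates.

    records: iterable of (group, predicted_high_risk, reoffended)
    tuples. A false positive is a non-recidivist labeled high risk;
    a false negative is a recidivist labeled low risk. These are the
    two quantities Larson et al. compare across groups.
    """
    stats = {}
    for group, pred_high, reoffended in records:
        s = stats.setdefault(group, {"fp": 0, "neg": 0, "fn": 0, "pos": 0})
        if reoffended:
            s["pos"] += 1
            if not pred_high:
                s["fn"] += 1  # recidivist mislabeled low risk
        else:
            s["neg"] += 1
            if pred_high:
                s["fp"] += 1  # non-recidivist mislabeled high risk
    return {
        g: {
            "false_positive_rate": s["fp"] / s["neg"] if s["neg"] else 0.0,
            "false_negative_rate": s["fn"] / s["pos"] if s["pos"] else 0.0,
        }
        for g, s in stats.items()
    }
```

A risk tool can be well calibrated overall while these two rates still differ sharply between groups, which is exactly the pattern the ProPublica analysis surfaced.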
Suppose we have a population that splits naturally along categories (e.g., location, gender, crime type, or any other criterion) and that
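The feedback dynamic behind this kind of simulation can be sketched in a toy model (an illustration under assumed parameters, not the report's actual Figure 1 model): two districts share the same true crime rate, but one starts with a larger recorded-crime history, and each round the system allocates patrols in proportion to recorded crime. Because patrols can only record crime where they are sent, the initial disparity is continually reinforced rather than washed out.

```python
import random

def simulate_predictive_policing(true_rates=(0.1, 0.1),
                                 init_observed=(5, 1),
                                 n_patrols=100,
                                 n_rounds=50,
                                 seed=0):
    """Toy feedback-loop simulation of patrol allocation.

    Both districts have the SAME true crime rate, but district 0
    starts with more recorded crime. Each round, patrols go to
    districts in proportion to recorded crime, and each patrol
    records a crime with probability equal to the district's true
    rate. Returns the final recorded-crime counts and the history
    of district 0's share of all recorded crime.
    """
    rng = random.Random(seed)
    observed = list(init_observed)
    share_history = []
    for _ in range(n_rounds):
        total = sum(observed)
        # Allocate patrols proportional to recorded (not true) crime.
        patrols_0 = round(n_patrols * observed[0] / total)
        patrols = (patrols_0, n_patrols - patrols_0)
        for d in (0, 1):
            # Crime is only recorded where patrols are present.
            observed[d] += sum(rng.random() < true_rates[d]
                               for _ in range(patrols[d]))
        share_history.append(observed[0] / sum(observed))
    return observed, share_history
```

Running this with equal true rates shows district 0 retaining the bulk of the patrols, and hence the bulk of the recorded crime, for the entire run: the algorithm is internally consistent, yet its outputs ratify the historical disparity it was seeded with.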