
Intelligence in Our Image: Risks of Bias and Errors in Artificial Intelligence


A Case Study: Artificial Agents in the Criminal Justice System

The U.S. criminal justice system is increasingly resorting to algorithmic tools. Artificial agents help ease the burden of managing such a large system. But any systematic algorithmic bias in these tools would carry a high risk of errors and cumulative disadvantage.

We first look at the use of algorithms at the sentencing and parole phase. Angwin et al. (2016) reported on Northpointe’s Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) criminal risk assessment system. This software is used in sentencing and parole hearings across the country. Angwin et al. presented anecdotes showing the system misrepresenting the recidivism risks of different convicts. These anecdotes hint at a systematic racial bias in the risk estimation: black convicts were rated higher risk than nonblack convicts, even when the nonblack convicts had more-severe offenses. The authors followed up on this hint with an analysis of COMPAS and recidivism data from Broward County, Florida. Larson et al. (2016), detailing the statistical analysis in Angwin et al. (2016), found that black defendants were twice as likely as white defendants to be misclassified as a higher risk of violent recidivism, and that white recidivists were misclassified as low risk 63.2% more often than black recidivists. (A sketch of this kind of per-group error-rate comparison appears at the end of this section.)

Police departments are also resorting to algorithmic tools for predictive policing and resource allocation. Figure 1 presents a simulated example: a mathematically acceptable algorithm for finding criminals based on historical crime data that nonetheless produces inequitable predictive policing behavior. The figure illustrates the simulated behavior of an automated system for directing law enforcement efforts in finding and responding to criminal events in the whole population.

Suppose we have a population that splits naturally along categories (e.g., location, gender, crime type, or any other criterion) and that
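The split-population setup above drives the feedback loop that Figure 1 depicts: wherever the algorithm directs enforcement, more crime gets recorded, which in turn attracts more enforcement. Below is a minimal Python sketch of such a loop, assuming two groups with identical true crime rates, a small initial gap in recorded crime, and a single patrol allocated purely by record counts. It illustrates the dynamic only; it is not the report’s actual Figure 1 simulation, and all parameter values are made up.

import random

random.seed(0)  # reproducible illustration

TRUE_CRIME_RATE = 0.1          # assumed identical for both groups
POPULATION = 1000              # assumed size of each group
recorded = {"A": 12, "B": 10}  # assumed small initial gap in recorded crime

for _ in range(50):
    # Send the patrol to the group with the most recorded crime so far.
    target = max(recorded, key=recorded.get)
    # Crimes occur in both groups at the same true rate...
    crimes = {g: sum(random.random() < TRUE_CRIME_RATE for _ in range(POPULATION))
              for g in recorded}
    # ...but only the patrolled group's crimes enter the record.
    recorded[target] += crimes[target]

print(recorded)  # group A ends up with essentially all recorded crime

Because only patrolled crime is observed, the initial two-record gap compounds into a large recorded disparity even though the underlying rates are equal by construction. This is one way a "mathematically acceptable" allocation rule can produce cumulative disadvantage.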

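Returning to the COMPAS findings above: the Larson et al. comparison amounts to computing two error rates separately for each racial group, the false positive rate (non-recidivists labeled high risk) and the false negative rate (recidivists labeled low risk). The sketch below shows one way to compute these per-group rates. It illustrates the metric only; it is not ProPublica's analysis code, and the record field names are assumptions.

def misclassification_rates(records):
    """Per-group false positive and false negative rates.

    records: iterable of dicts with assumed keys "group",
    "predicted_high_risk" (bool), and "recidivated" (bool).
    """
    counts = {}
    for r in records:
        c = counts.setdefault(r["group"], {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
        if r["recidivated"]:
            c["pos"] += 1                     # actual recidivist...
            if not r["predicted_high_risk"]:
                c["fn"] += 1                  # ...misclassified as low risk
        else:
            c["neg"] += 1                     # did not recidivate...
            if r["predicted_high_risk"]:
                c["fp"] += 1                  # ...misclassified as high risk
    return {
        group: {
            "false_positive_rate": c["fp"] / c["neg"] if c["neg"] else None,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else None,
        }
        for group, c in counts.items()
    }

A disparity like the one reported for Broward County would show up here as a higher false positive rate for black defendants and a higher false negative rate for white defendants.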