UK POLICE WILL START USING AI TO DECIDE WHETHER SUSPECTS SHOULD BE KEPT IN CUSTODY

[5/21/17]  Police in Durham, England, are preparing to go live with a predictive artificial intelligence system that will determine whether a suspect should be kept in custody, according to the BBC. Called Hart, which stands for Harm Assessment Risk Tool, the system is designed to classify individuals as being at low, medium, or high risk of committing a future offense. Police plan to put it live in the next few months to test its effectiveness against cases in which custody specialists do not rely on the system’s judgement.

The AI assessment could be used for a number of different determinations, such as whether a suspect should be held for a longer period and whether bail should be set before or after a charge is issued. According to the BBC, Hart’s decision-making is based on Durham police data gathered between 2008 and 2013, and it accounts for factors like criminal history, the severity of the current offense, and whether a suspect is a flight risk. In initial tests in 2013, in which suspects’ behavior was tracked for two years after an initial offense, Hart’s low-risk forecasts were accurate 98 percent of the time and its high-risk forecasts were accurate 88 percent of the time.
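The BBC report does not describe Hart’s internals, but tools of this kind are typically supervised classifiers trained on historical case records. The sketch below is purely illustrative: it uses synthetic data, made-up feature names (prior offenses, offense severity, flight risk), and a generic scikit-learn random forest rather than Durham’s actual model or data, to show how such a system might assign low/medium/high risk tiers and how per-tier forecast accuracy can be measured.

```python
# Hypothetical sketch of a custody risk-assessment classifier.
# NOT Durham's Hart model; synthetic data and invented features only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Stand-ins for the kinds of factors the article mentions:
# criminal history, severity of the current offense, flight risk.
X = np.column_stack([
    rng.poisson(2, n),        # prior_offenses (count)
    rng.integers(1, 6, n),    # offense_severity (1 = minor, 5 = serious)
    rng.integers(0, 2, n),    # flight_risk flag (0/1)
])

# Synthetic labels: 0 = low, 1 = medium, 2 = high risk of reoffending.
score = 0.4 * X[:, 0] + 0.5 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 1, n)
y = np.digitize(score, bins=[2.0, 4.0])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Report accuracy per predicted tier, mirroring how the article quotes
# Hart's results (share of low-risk / high-risk forecasts that held up).
pred = model.predict(X_test)
for tier, name in enumerate(["low", "medium", "high"]):
    mask = pred == tier
    if mask.any():
        acc = (y_test[mask] == tier).mean()
        print(f"{name}-risk forecasts correct: {acc:.0%} ({mask.sum()} cases)")
```

The per-tier numbers printed here correspond to the way the article reports Hart’s accuracy: not overall accuracy, but the fraction of cases forecast as low risk or high risk that turned out to match the tracked outcome.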

Hart is just one of many algorithmic and predictive software tools being used by law enforcement officials and court and prison systems around the globe. And although they may improve efficiency in police departments, the Orwellian undercurrents of an algorithmic criminal justice system have been backed up by troubling hard data.

In a thorough investigation published by ProPublica last year, these risk-assessment tools were found to be deeply flawed, with built-in human bias that made them twice as likely to flag black defendants as future criminals and far more likely to treat white defendants as low-risk, one-time offenders. Many algorithmic systems today, including those employed by Facebook, Google, and other tech companies, are similarly at risk of injecting bias, because human judgement was used to craft the software in the first place.

