
Why using AI in policing decisions risks race and class bias


AI is rocking the world of policing — and the consequences are still unclear. 

British police are poised to go live with a predictive artificial intelligence system that will help officers assess the risk of suspects re-offending. 

It's not Minority Report (yet), but it certainly sounds scary. Like the evil AIs in the movies, this tool has an acronym: HART, which stands for Harm Assessment Risk Tool. It is going live in Durham after a long trial.

The system, which classifies suspects as low, medium, or high risk of committing a future offence, was tested in 2013 using data that Durham police gathered from 2008 to 2012.
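HART's exact model and feature set are not described in this article, but the three-tier classification it mentions can be illustrated with a toy sketch. Everything below is hypothetical: the feature names (`prior_offences`, `age`, `months_since_last_offence`), the weights, and the thresholds are invented for illustration, not taken from the real tool. The sketch also hints at the article's concern: features like these can act as proxies for race and class, so the bias risk lives in the data and weighting, not in any single line of code.

```python
# Illustrative sketch ONLY -- not the real HART model. The features,
# weights, and cut-offs below are invented assumptions for illustration.

def assess_risk(prior_offences: int, age: int,
                months_since_last_offence: int) -> str:
    """Return a coarse 'low' / 'medium' / 'high' reoffending-risk label."""
    score = 0
    score += min(prior_offences, 10)   # more prior offences -> higher score
    if age < 25:
        score += 2                     # youth weighted upward (assumption)
    if months_since_last_offence < 12:
        score += 3                     # recent offending weighted upward
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

print(assess_risk(prior_offences=0, age=40, months_since_last_offence=120))  # low
print(assess_risk(prior_offences=6, age=22, months_since_last_offence=3))    # high
```

Note that none of these inputs mention race or class directly, yet any of them could correlate with both in the underlying arrest data, which is the core of the bias concern.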


Source: Mashable

Credit to the original site; to continue reading, click the link or copy it into your browser: http://ift.tt/2pGxu8i

