Decision systems that respect privacy, fairness

Friday, August 11, 2017 - 05:22 in Psychology & Sociology

Increasingly, decisions and actions affecting people's lives are determined by automated systems processing personal data. Excitement about these systems has been accompanied by serious concerns about their opacity and the threats they pose to privacy, fairness, and other values. Examples abound in real-world systems: Target's use of predicted pregnancy status for marketing; Google's use of health-related search queries for targeted advertising; race being associated with automated predictions of recidivism; gender affecting displayed job-related ads; race affecting displayed search ads; Boston's Street Bump app focusing pothole repair on affluent neighborhoods; Amazon's same-day delivery being unavailable in black neighborhoods; and Facebook showing either "white" or "black" movie trailers based on "ethnic affiliation."

Read the whole article on Physorg
